00:00:00.000 Started by upstream project "autotest-per-patch" build number 127089
00:00:00.000 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.070 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.070 The recommended git tool is: git
00:00:00.071 using credential 00000000-0000-0000-0000-000000000002
00:00:00.074 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.111 Fetching changes from the remote Git repository
00:00:00.113 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.156 Using shallow fetch with depth 1
00:00:00.156 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.156 > git --version # timeout=10
00:00:00.196 > git --version # 'git version 2.39.2'
00:00:00.196 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.221 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.221 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:04.953 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:04.967 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:04.979 Checking out Revision f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 (FETCH_HEAD)
00:00:04.979 > git config core.sparsecheckout # timeout=10
00:00:04.989 > git read-tree -mu HEAD # timeout=10
00:00:05.007 > git checkout -f f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=5
00:00:05.041 Commit message: "spdk-abi-per-patch: fix check-so-deps-docker-autotest parameters"
00:00:05.042 > git rev-list --no-walk f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=10
00:00:05.140 [Pipeline] Start of Pipeline
00:00:05.153 [Pipeline] library
00:00:05.154 Loading library shm_lib@master
00:00:05.155 Library shm_lib@master is cached. Copying from home.
00:00:05.171 [Pipeline] node
00:00:05.194 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:05.195 [Pipeline] {
00:00:05.206 [Pipeline] catchError
00:00:05.207 [Pipeline] {
00:00:05.220 [Pipeline] wrap
00:00:05.229 [Pipeline] {
00:00:05.237 [Pipeline] stage
00:00:05.239 [Pipeline] { (Prologue)
00:00:05.417 [Pipeline] sh
00:00:05.701 + logger -p user.info -t JENKINS-CI
00:00:05.718 [Pipeline] echo
00:00:05.720 Node: WFP50
00:00:05.727 [Pipeline] sh
00:00:06.024 [Pipeline] setCustomBuildProperty
00:00:06.039 [Pipeline] echo
00:00:06.040 Cleanup processes
00:00:06.045 [Pipeline] sh
00:00:06.323 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.323 1212323 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.333 [Pipeline] sh
00:00:06.616 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.616 ++ grep -v 'sudo pgrep'
00:00:06.616 ++ awk '{print $1}'
00:00:06.616 + sudo kill -9
00:00:06.616 + true
00:00:06.631 [Pipeline] cleanWs
00:00:06.639 [WS-CLEANUP] Deleting project workspace...
00:00:06.639 [WS-CLEANUP] Deferred wipeout is used...
00:00:06.645 [WS-CLEANUP] done
00:00:06.649 [Pipeline] setCustomBuildProperty
00:00:06.694 [Pipeline] sh
00:00:06.973 + sudo git config --global --replace-all safe.directory '*'
00:00:07.053 [Pipeline] httpRequest
00:00:07.080 [Pipeline] echo
00:00:07.081 Sorcerer 10.211.164.101 is alive
00:00:07.088 [Pipeline] httpRequest
00:00:07.093 HttpMethod: GET
00:00:07.093 URL: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:07.094 Sending request to url: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:07.109 Response Code: HTTP/1.1 200 OK
00:00:07.109 Success: Status code 200 is in the accepted range: 200,404
00:00:07.110 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:10.632 [Pipeline] sh
00:00:10.917 + tar --no-same-owner -xf jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:10.934 [Pipeline] httpRequest
00:00:10.958 [Pipeline] echo
00:00:10.960 Sorcerer 10.211.164.101 is alive
00:00:10.969 [Pipeline] httpRequest
00:00:10.973 HttpMethod: GET
00:00:10.974 URL: http://10.211.164.101/packages/spdk_3bc1795d30f064434535a81a05bbd560c40a398b.tar.gz
00:00:10.974 Sending request to url: http://10.211.164.101/packages/spdk_3bc1795d30f064434535a81a05bbd560c40a398b.tar.gz
00:00:10.999 Response Code: HTTP/1.1 200 OK
00:00:10.999 Success: Status code 200 is in the accepted range: 200,404
00:00:11.000 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_3bc1795d30f064434535a81a05bbd560c40a398b.tar.gz
00:01:08.601 [Pipeline] sh
00:01:08.886 + tar --no-same-owner -xf spdk_3bc1795d30f064434535a81a05bbd560c40a398b.tar.gz
00:01:13.122 [Pipeline] sh
00:01:13.406 + git -C spdk log --oneline -n5
00:01:13.406 3bc1795d3 accel_perf: add support for DIX Generate/Verify
00:01:13.406 0a6bb28fa test/accel/dif: add DIX Generate/Verify suites
00:01:13.406 52c295e65 lib/accel: add DIX verify
00:01:13.406 b5c6fc4f3 lib/accel: add DIX generate
00:01:13.406 8ee2672c4 test/bdev: Add test for resized RAID with superblock
00:01:13.420 [Pipeline] }
00:01:13.440 [Pipeline] // stage
00:01:13.449 [Pipeline] stage
00:01:13.452 [Pipeline] { (Prepare)
00:01:13.470 [Pipeline] writeFile
00:01:13.486 [Pipeline] sh
00:01:13.765 + logger -p user.info -t JENKINS-CI
00:01:13.778 [Pipeline] sh
00:01:14.065 + logger -p user.info -t JENKINS-CI
00:01:14.076 [Pipeline] sh
00:01:14.360 + cat autorun-spdk.conf
00:01:14.360 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:14.360 SPDK_TEST_BLOCKDEV=1
00:01:14.360 SPDK_TEST_ISAL=1
00:01:14.360 SPDK_TEST_CRYPTO=1
00:01:14.360 SPDK_TEST_REDUCE=1
00:01:14.360 SPDK_TEST_VBDEV_COMPRESS=1
00:01:14.360 SPDK_RUN_UBSAN=1
00:01:14.360 SPDK_TEST_ACCEL=1
00:01:14.368 RUN_NIGHTLY=0
00:01:14.372 [Pipeline] readFile
00:01:14.399 [Pipeline] withEnv
00:01:14.401 [Pipeline] {
00:01:14.416 [Pipeline] sh
00:01:14.701 + set -ex
00:01:14.702 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:01:14.702 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:14.702 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:14.702 ++ SPDK_TEST_BLOCKDEV=1
00:01:14.702 ++ SPDK_TEST_ISAL=1
00:01:14.702 ++ SPDK_TEST_CRYPTO=1
00:01:14.702 ++ SPDK_TEST_REDUCE=1
00:01:14.702 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:14.702 ++ SPDK_RUN_UBSAN=1
00:01:14.702 ++ SPDK_TEST_ACCEL=1
00:01:14.702 ++ RUN_NIGHTLY=0
00:01:14.702 + case $SPDK_TEST_NVMF_NICS in
00:01:14.702 + DRIVERS=
00:01:14.702 + [[ -n '' ]]
00:01:14.702 + exit 0
00:01:14.711 [Pipeline] }
00:01:14.730 [Pipeline] // withEnv
00:01:14.736 [Pipeline] }
00:01:14.752 [Pipeline] // stage
00:01:14.762 [Pipeline] catchError
00:01:14.764 [Pipeline] {
00:01:14.783 [Pipeline] timeout
00:01:14.784 Timeout set to expire in 1 hr 0 min
00:01:14.785 [Pipeline] {
00:01:14.802 [Pipeline] stage
00:01:14.803 [Pipeline] { (Tests)
00:01:14.820 [Pipeline] sh
00:01:15.103 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:01:15.104 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:01:15.104 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:01:15.104 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:01:15.104 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:15.104 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:01:15.104 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:01:15.104 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:15.104 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:01:15.104 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:15.104 + [[ crypto-phy-autotest == pkgdep-* ]]
00:01:15.104 + cd /var/jenkins/workspace/crypto-phy-autotest
00:01:15.104 + source /etc/os-release
00:01:15.104 ++ NAME='Fedora Linux'
00:01:15.104 ++ VERSION='38 (Cloud Edition)'
00:01:15.104 ++ ID=fedora
00:01:15.104 ++ VERSION_ID=38
00:01:15.104 ++ VERSION_CODENAME=
00:01:15.104 ++ PLATFORM_ID=platform:f38
00:01:15.104 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:15.104 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:15.104 ++ LOGO=fedora-logo-icon
00:01:15.104 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:15.104 ++ HOME_URL=https://fedoraproject.org/
00:01:15.104 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:15.104 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:15.104 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:15.104 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:15.104 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:15.104 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:15.104 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:15.104 ++ SUPPORT_END=2024-05-14
00:01:15.104 ++ VARIANT='Cloud Edition'
00:01:15.104 ++ VARIANT_ID=cloud
00:01:15.104 + uname -a
00:01:15.104 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:15.104 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:01:18.397 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:01:18.397 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:01:18.397 Hugepages
00:01:18.397 node hugesize free / total
00:01:18.397 node0 1048576kB 0 / 0
00:01:18.397 node0 2048kB 0 / 0
00:01:18.397 node1 1048576kB 0 / 0
00:01:18.397 node1 2048kB 0 / 0
00:01:18.397
00:01:18.397 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:18.397 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:18.397 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:18.397 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:18.397 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:18.397 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:18.397 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:18.397 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:18.397 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:18.397 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:01:18.397 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:18.397 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:18.397 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:18.397 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:18.398 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:18.398 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:18.398 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:18.398 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:18.398 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:01:18.398 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:01:18.657 + rm -f /tmp/spdk-ld-path
00:01:18.657 + source autorun-spdk.conf
00:01:18.657 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:18.657 ++ SPDK_TEST_BLOCKDEV=1
00:01:18.657 ++ SPDK_TEST_ISAL=1
00:01:18.657 ++ SPDK_TEST_CRYPTO=1
00:01:18.657 ++ SPDK_TEST_REDUCE=1
00:01:18.657 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:18.657 ++ SPDK_RUN_UBSAN=1
00:01:18.657 ++ SPDK_TEST_ACCEL=1
00:01:18.657 ++ RUN_NIGHTLY=0
00:01:18.657 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:18.657 + [[ -n '' ]]
00:01:18.657 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:18.657 + for M in /var/spdk/build-*-manifest.txt
00:01:18.657 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:18.657 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:18.657 + for M in /var/spdk/build-*-manifest.txt
00:01:18.657 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:18.657 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:18.657 ++ uname
00:01:18.657 + [[ Linux == \L\i\n\u\x ]]
00:01:18.657 + sudo dmesg -T
00:01:18.657 + sudo dmesg --clear
00:01:18.657 + dmesg_pid=1213812
00:01:18.658 + [[ Fedora Linux == FreeBSD ]]
00:01:18.658 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:18.658 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:18.658 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:18.658 + [[ -x /usr/src/fio-static/fio ]]
00:01:18.658 + export FIO_BIN=/usr/src/fio-static/fio
00:01:18.658 + FIO_BIN=/usr/src/fio-static/fio
00:01:18.658 + sudo dmesg -Tw
00:01:18.658 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:18.658 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:18.658 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:18.658 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:18.658 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:18.658 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:18.658 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:18.658 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:18.658 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:18.658 Test configuration:
00:01:18.658 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:18.658 SPDK_TEST_BLOCKDEV=1
00:01:18.658 SPDK_TEST_ISAL=1
00:01:18.658 SPDK_TEST_CRYPTO=1
00:01:18.658 SPDK_TEST_REDUCE=1
00:01:18.658 SPDK_TEST_VBDEV_COMPRESS=1
00:01:18.658 SPDK_RUN_UBSAN=1
00:01:18.658 SPDK_TEST_ACCEL=1
00:01:18.658 RUN_NIGHTLY=0
19:37:10 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
19:37:10 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
19:37:10 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
19:37:10 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
19:37:10 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
19:37:10 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
19:37:10 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
19:37:10 -- paths/export.sh@5 -- $ export PATH
19:37:10 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
19:37:10 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
19:37:10 -- common/autobuild_common.sh@447 -- $ date +%s
19:37:10 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721842630.XXXXXX
19:37:10 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721842630.x5VG9v
19:37:10 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
19:37:10 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
19:37:10 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
19:37:10 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
19:37:10 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
19:37:10 -- common/autobuild_common.sh@463 -- $ get_config_params
19:37:10 -- common/autotest_common.sh@398 -- $ xtrace_disable
19:37:10 -- common/autotest_common.sh@10 -- $ set +x
00:01:18.918 19:37:10 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
19:37:10 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
19:37:10 -- pm/common@17 -- $ local monitor
19:37:10 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
19:37:10 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
19:37:10 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
19:37:10 -- pm/common@21 -- $ date +%s
19:37:10 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
19:37:10 -- pm/common@21 -- $ date +%s
19:37:10 -- pm/common@25 -- $ sleep 1
19:37:10 -- pm/common@21 -- $ date +%s
19:37:10 -- pm/common@21 -- $ date +%s
19:37:10 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721842630
19:37:10 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721842630
19:37:10 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721842630
19:37:10 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721842630
00:01:18.918 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721842630_collect-vmstat.pm.log
00:01:18.918 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721842630_collect-cpu-load.pm.log
00:01:18.918 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721842630_collect-cpu-temp.pm.log
00:01:18.918 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721842630_collect-bmc-pm.bmc.pm.log
00:01:19.856 19:37:11 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
19:37:11 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
19:37:11 -- spdk/autobuild.sh@12 -- $ umask 022
19:37:11 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
19:37:11 -- spdk/autobuild.sh@16 -- $ date -u
00:01:19.856 Wed Jul 24 05:37:11 PM UTC 2024
00:01:19.856 19:37:11 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:19.856 v24.09-pre-320-g3bc1795d3
00:01:19.856 19:37:11 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
19:37:11 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
19:37:11 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
19:37:11 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']'
19:37:11 -- common/autotest_common.sh@1107 -- $ xtrace_disable
19:37:11 -- common/autotest_common.sh@10 -- $ set +x
00:01:19.856 ************************************
00:01:19.856 START TEST ubsan
00:01:19.856 ************************************
00:01:19.856 19:37:11 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan'
00:01:19.856 using ubsan
00:01:19.856
00:01:19.856 real 0m0.001s
00:01:19.856 user 0m0.001s
00:01:19.856 sys 0m0.000s
00:01:19.856 19:37:11 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable
00:01:19.856 19:37:11 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:19.856 ************************************
00:01:19.856 END TEST ubsan
00:01:19.856 ************************************
00:01:19.856 19:37:11 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
19:37:11 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
19:37:11 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
19:37:11 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
19:37:11 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
19:37:11 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
19:37:11 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
19:37:11 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
19:37:11 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:01:20.114 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:01:20.114 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:20.373 Using 'verbs' RDMA provider
00:01:36.631 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:51.594 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:52.162 Creating mk/config.mk...done.
00:01:52.162 Creating mk/cc.flags.mk...done.
00:01:52.162 Type 'make' to build.
00:01:52.162 19:37:43 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:52.163 19:37:43 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']'
00:01:52.163 19:37:43 -- common/autotest_common.sh@1107 -- $ xtrace_disable
00:01:52.163 19:37:43 -- common/autotest_common.sh@10 -- $ set +x
00:01:52.163 ************************************
00:01:52.163 START TEST make
00:01:52.163 ************************************
00:01:52.163 19:37:43 make -- common/autotest_common.sh@1125 -- $ make -j72
00:01:52.421 make[1]: Nothing to be done for 'all'.
00:02:31.178 The Meson build system
00:02:31.178 Version: 1.3.1
00:02:31.178 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:02:31.178 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:02:31.178 Build type: native build
00:02:31.178 Program cat found: YES (/usr/bin/cat)
00:02:31.178 Project name: DPDK
00:02:31.178 Project version: 24.03.0
00:02:31.178 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:02:31.178 C linker for the host machine: cc ld.bfd 2.39-16
00:02:31.178 Host machine cpu family: x86_64
00:02:31.178 Host machine cpu: x86_64
00:02:31.178 Message: ## Building in Developer Mode ##
00:02:31.178 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:31.178 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:31.178 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:31.178 Program python3 found: YES (/usr/bin/python3)
00:02:31.178 Program cat found: YES (/usr/bin/cat)
00:02:31.178 Compiler for C supports arguments -march=native: YES
00:02:31.179 Checking for size of "void *" : 8
00:02:31.179 Checking for size of "void *" : 8 (cached)
00:02:31.179 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:02:31.179 Library m found: YES
00:02:31.179 Library numa found: YES
00:02:31.179 Has header "numaif.h" : YES
00:02:31.179 Library fdt found: NO
00:02:31.179 Library execinfo found: NO
00:02:31.179 Has header "execinfo.h" : YES
00:02:31.179 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:31.179 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:31.179 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:31.179 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:31.179 Run-time dependency openssl found: YES 3.0.9
00:02:31.179 Run-time dependency libpcap found: YES 1.10.4
00:02:31.179 Has header "pcap.h" with dependency libpcap: YES
00:02:31.179 Compiler for C supports arguments -Wcast-qual: YES
00:02:31.179 Compiler for C supports arguments -Wdeprecated: YES
00:02:31.179 Compiler for C supports arguments -Wformat: YES
00:02:31.179 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:31.179 Compiler for C supports arguments -Wformat-security: NO
00:02:31.179 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:31.179 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:31.179 Compiler for C supports arguments -Wnested-externs: YES
00:02:31.179 Compiler for C supports arguments -Wold-style-definition: YES
00:02:31.179 Compiler for C supports arguments -Wpointer-arith: YES
00:02:31.179 Compiler for C supports arguments -Wsign-compare: YES
00:02:31.179 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:31.179 Compiler for C supports arguments -Wundef: YES
00:02:31.179 Compiler for C supports arguments -Wwrite-strings: YES
00:02:31.179 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:31.179 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:31.179 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:31.179 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:31.179 Program objdump found: YES (/usr/bin/objdump)
00:02:31.179 Compiler for C supports arguments -mavx512f: YES
00:02:31.179 Checking if "AVX512 checking" compiles: YES
00:02:31.179 Fetching value of define "__SSE4_2__" : 1
00:02:31.179 Fetching value of define "__AES__" : 1
00:02:31.179 Fetching value of define "__AVX__" : 1
00:02:31.179 Fetching value of define "__AVX2__" : 1
00:02:31.179 Fetching value of define "__AVX512BW__" : 1
00:02:31.179 Fetching value of define "__AVX512CD__" : 1
00:02:31.179 Fetching value of define "__AVX512DQ__" : 1
00:02:31.179 Fetching value of define "__AVX512F__" : 1
00:02:31.179 Fetching value of define "__AVX512VL__" : 1
00:02:31.179 Fetching value of define "__PCLMUL__" : 1
00:02:31.179 Fetching value of define "__RDRND__" : 1
00:02:31.179 Fetching value of define "__RDSEED__" : 1
00:02:31.179 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:31.179 Fetching value of define "__znver1__" : (undefined)
00:02:31.179 Fetching value of define "__znver2__" : (undefined)
00:02:31.179 Fetching value of define "__znver3__" : (undefined)
00:02:31.179 Fetching value of define "__znver4__" : (undefined)
00:02:31.179 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:31.179 Message: lib/log: Defining dependency "log"
00:02:31.179 Message: lib/kvargs: Defining dependency "kvargs"
00:02:31.179 Message: lib/telemetry: Defining dependency "telemetry"
00:02:31.179 Checking for function "getentropy" : NO
00:02:31.179 Message: lib/eal: Defining dependency "eal"
00:02:31.179 Message: lib/ring: Defining dependency "ring"
00:02:31.179 Message: lib/rcu: Defining dependency "rcu"
00:02:31.179 Message: lib/mempool: Defining dependency "mempool"
00:02:31.179 Message: lib/mbuf: Defining dependency "mbuf"
00:02:31.179 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:31.179 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:31.179 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:31.179 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:31.179 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:31.179 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:31.179 Compiler for C supports arguments -mpclmul: YES
00:02:31.179 Compiler for C supports arguments -maes: YES
00:02:31.179 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:31.179 Compiler for C supports arguments -mavx512bw: YES
00:02:31.179 Compiler for C supports arguments -mavx512dq: YES
00:02:31.179 Compiler for C supports arguments -mavx512vl: YES
00:02:31.179 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:31.179 Compiler for C supports arguments -mavx2: YES
00:02:31.179 Compiler for C supports arguments -mavx: YES
00:02:31.179 Message: lib/net: Defining dependency "net"
00:02:31.179 Message: lib/meter: Defining dependency "meter"
00:02:31.179 Message: lib/ethdev: Defining dependency "ethdev"
00:02:31.179 Message: lib/pci: Defining dependency "pci"
00:02:31.179 Message: lib/cmdline: Defining dependency "cmdline"
00:02:31.179 Message: lib/hash: Defining dependency "hash"
00:02:31.179 Message: lib/timer: Defining dependency "timer"
00:02:31.179 Message: lib/compressdev: Defining dependency "compressdev"
00:02:31.179 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:31.179 Message: lib/dmadev: Defining dependency "dmadev"
00:02:31.179 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:31.179 Message: lib/power: Defining dependency "power"
00:02:31.179 Message: lib/reorder: Defining dependency "reorder"
00:02:31.179 Message: lib/security: Defining dependency "security"
00:02:31.179 Has header "linux/userfaultfd.h" : YES
00:02:31.179 Has header "linux/vduse.h" : YES
00:02:31.179 Message: lib/vhost: Defining dependency "vhost"
00:02:31.179 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:31.179 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:31.179 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:31.179 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:31.179 Compiler for C supports arguments -std=c11: YES
00:02:31.179 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:31.179 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:31.179 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:31.179 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:31.179 Run-time dependency libmlx5 found: YES 1.24.44.0
00:02:31.179 Run-time dependency libibverbs found: YES 1.14.44.0
00:02:31.179 Library mtcr_ul found: NO
00:02:31.179 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:31.179 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:33.716 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:33.716 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:33.716 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:02:33.717 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies 
libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:33.717 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:33.717 Configuring mlx5_autoconf.h using configuration 00:02:33.717 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:33.717 Run-time dependency libcrypto found: YES 3.0.9 00:02:33.717 Library IPSec_MB found: YES 00:02:33.717 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:33.717 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:33.717 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:33.717 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:33.717 Library IPSec_MB found: YES 00:02:33.717 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:33.717 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:33.717 Compiler for C supports arguments 
-std=c11: YES (cached) 00:02:33.717 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:33.717 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:33.717 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:33.717 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:33.717 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:33.717 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:33.717 Library libisal found: NO 00:02:33.717 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:33.717 Compiler for C supports arguments -std=c11: YES (cached) 00:02:33.717 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:33.717 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:33.717 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:33.717 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:33.717 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:33.717 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:33.717 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:33.717 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:33.717 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:33.717 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:33.717 Program doxygen found: YES (/usr/bin/doxygen) 00:02:33.717 Configuring doxy-api-html.conf using configuration 00:02:33.717 Configuring doxy-api-man.conf using configuration 00:02:33.717 Program mandb found: YES (/usr/bin/mandb) 00:02:33.717 Program sphinx-build found: NO 00:02:33.717 Configuring rte_build_config.h using configuration 00:02:33.717 Message: 00:02:33.717 ================= 00:02:33.717 Applications Enabled 00:02:33.717 ================= 00:02:33.717 
00:02:33.717 apps: 00:02:33.717 00:02:33.717 00:02:33.717 Message: 00:02:33.717 ================= 00:02:33.717 Libraries Enabled 00:02:33.717 ================= 00:02:33.717 00:02:33.717 libs: 00:02:33.717 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:33.717 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:33.717 cryptodev, dmadev, power, reorder, security, vhost, 00:02:33.717 00:02:33.717 Message: 00:02:33.717 =============== 00:02:33.717 Drivers Enabled 00:02:33.717 =============== 00:02:33.717 00:02:33.717 common: 00:02:33.717 mlx5, qat, 00:02:33.717 bus: 00:02:33.717 auxiliary, pci, vdev, 00:02:33.717 mempool: 00:02:33.717 ring, 00:02:33.717 dma: 00:02:33.717 00:02:33.717 net: 00:02:33.717 00:02:33.717 crypto: 00:02:33.717 ipsec_mb, mlx5, 00:02:33.717 compress: 00:02:33.717 isal, mlx5, 00:02:33.717 vdpa: 00:02:33.717 00:02:33.717 00:02:33.717 Message: 00:02:33.717 ================= 00:02:33.717 Content Skipped 00:02:33.717 ================= 00:02:33.717 00:02:33.717 apps: 00:02:33.717 dumpcap: explicitly disabled via build config 00:02:33.717 graph: explicitly disabled via build config 00:02:33.717 pdump: explicitly disabled via build config 00:02:33.717 proc-info: explicitly disabled via build config 00:02:33.717 test-acl: explicitly disabled via build config 00:02:33.717 test-bbdev: explicitly disabled via build config 00:02:33.717 test-cmdline: explicitly disabled via build config 00:02:33.717 test-compress-perf: explicitly disabled via build config 00:02:33.717 test-crypto-perf: explicitly disabled via build config 00:02:33.717 test-dma-perf: explicitly disabled via build config 00:02:33.717 test-eventdev: explicitly disabled via build config 00:02:33.717 test-fib: explicitly disabled via build config 00:02:33.717 test-flow-perf: explicitly disabled via build config 00:02:33.717 test-gpudev: explicitly disabled via build config 00:02:33.717 test-mldev: explicitly disabled via build config 00:02:33.717 test-pipeline: explicitly 
disabled via build config 00:02:33.717 test-pmd: explicitly disabled via build config 00:02:33.717 test-regex: explicitly disabled via build config 00:02:33.717 test-sad: explicitly disabled via build config 00:02:33.717 test-security-perf: explicitly disabled via build config 00:02:33.717 00:02:33.717 libs: 00:02:33.717 argparse: explicitly disabled via build config 00:02:33.717 metrics: explicitly disabled via build config 00:02:33.717 acl: explicitly disabled via build config 00:02:33.717 bbdev: explicitly disabled via build config 00:02:33.717 bitratestats: explicitly disabled via build config 00:02:33.717 bpf: explicitly disabled via build config 00:02:33.717 cfgfile: explicitly disabled via build config 00:02:33.717 distributor: explicitly disabled via build config 00:02:33.718 efd: explicitly disabled via build config 00:02:33.718 eventdev: explicitly disabled via build config 00:02:33.718 dispatcher: explicitly disabled via build config 00:02:33.718 gpudev: explicitly disabled via build config 00:02:33.718 gro: explicitly disabled via build config 00:02:33.718 gso: explicitly disabled via build config 00:02:33.718 ip_frag: explicitly disabled via build config 00:02:33.718 jobstats: explicitly disabled via build config 00:02:33.718 latencystats: explicitly disabled via build config 00:02:33.718 lpm: explicitly disabled via build config 00:02:33.718 member: explicitly disabled via build config 00:02:33.718 pcapng: explicitly disabled via build config 00:02:33.718 rawdev: explicitly disabled via build config 00:02:33.718 regexdev: explicitly disabled via build config 00:02:33.718 mldev: explicitly disabled via build config 00:02:33.718 rib: explicitly disabled via build config 00:02:33.718 sched: explicitly disabled via build config 00:02:33.718 stack: explicitly disabled via build config 00:02:33.718 ipsec: explicitly disabled via build config 00:02:33.718 pdcp: explicitly disabled via build config 00:02:33.718 fib: explicitly disabled via build config 
00:02:33.718 port: explicitly disabled via build config 00:02:33.718 pdump: explicitly disabled via build config 00:02:33.718 table: explicitly disabled via build config 00:02:33.718 pipeline: explicitly disabled via build config 00:02:33.718 graph: explicitly disabled via build config 00:02:33.718 node: explicitly disabled via build config 00:02:33.718 00:02:33.718 drivers: 00:02:33.718 common/cpt: not in enabled drivers build config 00:02:33.718 common/dpaax: not in enabled drivers build config 00:02:33.718 common/iavf: not in enabled drivers build config 00:02:33.718 common/idpf: not in enabled drivers build config 00:02:33.718 common/ionic: not in enabled drivers build config 00:02:33.718 common/mvep: not in enabled drivers build config 00:02:33.718 common/octeontx: not in enabled drivers build config 00:02:33.718 bus/cdx: not in enabled drivers build config 00:02:33.718 bus/dpaa: not in enabled drivers build config 00:02:33.718 bus/fslmc: not in enabled drivers build config 00:02:33.718 bus/ifpga: not in enabled drivers build config 00:02:33.718 bus/platform: not in enabled drivers build config 00:02:33.718 bus/uacce: not in enabled drivers build config 00:02:33.718 bus/vmbus: not in enabled drivers build config 00:02:33.718 common/cnxk: not in enabled drivers build config 00:02:33.718 common/nfp: not in enabled drivers build config 00:02:33.718 common/nitrox: not in enabled drivers build config 00:02:33.718 common/sfc_efx: not in enabled drivers build config 00:02:33.718 mempool/bucket: not in enabled drivers build config 00:02:33.718 mempool/cnxk: not in enabled drivers build config 00:02:33.718 mempool/dpaa: not in enabled drivers build config 00:02:33.718 mempool/dpaa2: not in enabled drivers build config 00:02:33.718 mempool/octeontx: not in enabled drivers build config 00:02:33.718 mempool/stack: not in enabled drivers build config 00:02:33.718 dma/cnxk: not in enabled drivers build config 00:02:33.718 dma/dpaa: not in enabled drivers build config 
00:02:33.718 dma/dpaa2: not in enabled drivers build config 00:02:33.718 dma/hisilicon: not in enabled drivers build config 00:02:33.718 dma/idxd: not in enabled drivers build config 00:02:33.718 dma/ioat: not in enabled drivers build config 00:02:33.718 dma/skeleton: not in enabled drivers build config 00:02:33.718 net/af_packet: not in enabled drivers build config 00:02:33.718 net/af_xdp: not in enabled drivers build config 00:02:33.718 net/ark: not in enabled drivers build config 00:02:33.718 net/atlantic: not in enabled drivers build config 00:02:33.718 net/avp: not in enabled drivers build config 00:02:33.718 net/axgbe: not in enabled drivers build config 00:02:33.718 net/bnx2x: not in enabled drivers build config 00:02:33.718 net/bnxt: not in enabled drivers build config 00:02:33.718 net/bonding: not in enabled drivers build config 00:02:33.718 net/cnxk: not in enabled drivers build config 00:02:33.718 net/cpfl: not in enabled drivers build config 00:02:33.718 net/cxgbe: not in enabled drivers build config 00:02:33.718 net/dpaa: not in enabled drivers build config 00:02:33.718 net/dpaa2: not in enabled drivers build config 00:02:33.718 net/e1000: not in enabled drivers build config 00:02:33.718 net/ena: not in enabled drivers build config 00:02:33.718 net/enetc: not in enabled drivers build config 00:02:33.718 net/enetfec: not in enabled drivers build config 00:02:33.718 net/enic: not in enabled drivers build config 00:02:33.718 net/failsafe: not in enabled drivers build config 00:02:33.718 net/fm10k: not in enabled drivers build config 00:02:33.718 net/gve: not in enabled drivers build config 00:02:33.718 net/hinic: not in enabled drivers build config 00:02:33.718 net/hns3: not in enabled drivers build config 00:02:33.718 net/i40e: not in enabled drivers build config 00:02:33.718 net/iavf: not in enabled drivers build config 00:02:33.718 net/ice: not in enabled drivers build config 00:02:33.718 net/idpf: not in enabled drivers build config 00:02:33.718 
net/igc: not in enabled drivers build config 00:02:33.718 net/ionic: not in enabled drivers build config 00:02:33.718 net/ipn3ke: not in enabled drivers build config 00:02:33.718 net/ixgbe: not in enabled drivers build config 00:02:33.718 net/mana: not in enabled drivers build config 00:02:33.718 net/memif: not in enabled drivers build config 00:02:33.718 net/mlx4: not in enabled drivers build config 00:02:33.718 net/mlx5: not in enabled drivers build config 00:02:33.718 net/mvneta: not in enabled drivers build config 00:02:33.718 net/mvpp2: not in enabled drivers build config 00:02:33.718 net/netvsc: not in enabled drivers build config 00:02:33.718 net/nfb: not in enabled drivers build config 00:02:33.718 net/nfp: not in enabled drivers build config 00:02:33.718 net/ngbe: not in enabled drivers build config 00:02:33.718 net/null: not in enabled drivers build config 00:02:33.718 net/octeontx: not in enabled drivers build config 00:02:33.718 net/octeon_ep: not in enabled drivers build config 00:02:33.718 net/pcap: not in enabled drivers build config 00:02:33.718 net/pfe: not in enabled drivers build config 00:02:33.718 net/qede: not in enabled drivers build config 00:02:33.718 net/ring: not in enabled drivers build config 00:02:33.718 net/sfc: not in enabled drivers build config 00:02:33.718 net/softnic: not in enabled drivers build config 00:02:33.718 net/tap: not in enabled drivers build config 00:02:33.718 net/thunderx: not in enabled drivers build config 00:02:33.718 net/txgbe: not in enabled drivers build config 00:02:33.718 net/vdev_netvsc: not in enabled drivers build config 00:02:33.718 net/vhost: not in enabled drivers build config 00:02:33.718 net/virtio: not in enabled drivers build config 00:02:33.718 net/vmxnet3: not in enabled drivers build config 00:02:33.718 raw/*: missing internal dependency, "rawdev" 00:02:33.718 crypto/armv8: not in enabled drivers build config 00:02:33.718 crypto/bcmfs: not in enabled drivers build config 00:02:33.718 
crypto/caam_jr: not in enabled drivers build config 00:02:33.718 crypto/ccp: not in enabled drivers build config 00:02:33.718 crypto/cnxk: not in enabled drivers build config 00:02:33.718 crypto/dpaa_sec: not in enabled drivers build config 00:02:33.718 crypto/dpaa2_sec: not in enabled drivers build config 00:02:33.718 crypto/mvsam: not in enabled drivers build config 00:02:33.718 crypto/nitrox: not in enabled drivers build config 00:02:33.718 crypto/null: not in enabled drivers build config 00:02:33.718 crypto/octeontx: not in enabled drivers build config 00:02:33.718 crypto/openssl: not in enabled drivers build config 00:02:33.718 crypto/scheduler: not in enabled drivers build config 00:02:33.718 crypto/uadk: not in enabled drivers build config 00:02:33.718 crypto/virtio: not in enabled drivers build config 00:02:33.718 compress/nitrox: not in enabled drivers build config 00:02:33.718 compress/octeontx: not in enabled drivers build config 00:02:33.718 compress/zlib: not in enabled drivers build config 00:02:33.718 regex/*: missing internal dependency, "regexdev" 00:02:33.718 ml/*: missing internal dependency, "mldev" 00:02:33.718 vdpa/ifc: not in enabled drivers build config 00:02:33.718 vdpa/mlx5: not in enabled drivers build config 00:02:33.718 vdpa/nfp: not in enabled drivers build config 00:02:33.718 vdpa/sfc: not in enabled drivers build config 00:02:33.718 event/*: missing internal dependency, "eventdev" 00:02:33.718 baseband/*: missing internal dependency, "bbdev" 00:02:33.718 gpu/*: missing internal dependency, "gpudev" 00:02:33.718 00:02:33.718 00:02:33.978 Build targets in project: 115 00:02:33.978 00:02:33.978 DPDK 24.03.0 00:02:33.978 00:02:33.978 User defined options 00:02:33.978 buildtype : debug 00:02:33.978 default_library : shared 00:02:33.978 libdir : lib 00:02:33.978 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:33.978 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:33.978 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:33.978 cpu_instruction_set: native 00:02:33.978 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:33.978 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:02:33.978 enable_docs : false 00:02:33.978 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:33.978 enable_kmods : false 00:02:33.978 max_lcores : 128 00:02:33.978 tests : false 00:02:33.978 00:02:33.978 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:34.552 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:34.552 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:34.552 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:34.552 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:34.552 [4/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:34.552 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:34.552 [6/378] Compiling C object 
lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:34.552 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:34.552 [8/378] Linking static target lib/librte_kvargs.a 00:02:34.552 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:34.552 [10/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:34.552 [11/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:34.552 [12/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:34.552 [13/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:34.814 [14/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:34.814 [15/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:34.814 [16/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:34.814 [17/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:34.814 [18/378] Linking static target lib/librte_log.a 00:02:34.814 [19/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:35.085 [20/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:35.085 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:35.085 [22/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:35.085 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:35.085 [24/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.085 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:35.085 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:35.085 [27/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:35.085 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:35.085 [29/378] Compiling C 
object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:35.085 [30/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:35.085 [31/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:35.085 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:35.085 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:35.085 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:35.086 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:35.086 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:35.086 [37/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:35.086 [38/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:35.086 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:35.086 [40/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:35.086 [41/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:35.086 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:35.086 [43/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:35.086 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:35.345 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:35.345 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:35.345 [47/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:35.345 [48/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:35.345 [49/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:35.345 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:35.345 [51/378] 
Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:35.345 [52/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:35.345 [53/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:35.345 [54/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:35.345 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:35.345 [56/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:35.345 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:35.345 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:35.345 [59/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:35.345 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:35.345 [61/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:35.345 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:35.345 [63/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:35.345 [64/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:35.345 [65/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:35.345 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:35.345 [67/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:35.345 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:35.345 [69/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:35.345 [70/378] Linking static target lib/librte_telemetry.a 00:02:35.345 [71/378] Linking static target lib/librte_pci.a 00:02:35.345 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:35.345 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:35.345 [74/378] Compiling 
C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:35.345 [75/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:35.345 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:35.345 [77/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:35.345 [78/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:35.345 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:35.345 [80/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:35.345 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:35.345 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:35.345 [83/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:35.345 [84/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:35.345 [85/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:35.345 [86/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:35.345 [87/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:35.345 [88/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:35.345 [89/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:35.345 [90/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:35.345 [91/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:35.345 [92/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:35.345 [93/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:35.345 [94/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:35.603 [95/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:35.603 [96/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:35.603 [97/378] 
Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:35.603 [98/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:35.603 [99/378] Linking static target lib/librte_mempool.a 00:02:35.603 [100/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:35.603 [101/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:35.603 [102/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:35.603 [103/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:35.603 [104/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:35.603 [105/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:35.603 [106/378] Linking static target lib/librte_net.a 00:02:35.604 [107/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:35.604 [108/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.604 [109/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:35.604 [110/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:35.604 [111/378] Linking static target lib/librte_meter.a 00:02:35.604 [112/378] Linking target lib/librte_log.so.24.1 00:02:35.870 [113/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:35.870 [114/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.870 [115/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:35.870 [116/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:35.870 [117/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:35.870 [118/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:35.870 [119/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:35.870 [120/378] Compiling C 
object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:35.870 [121/378] Linking static target lib/librte_mbuf.a 00:02:35.870 [122/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:35.870 [123/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:35.870 [124/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:35.870 [125/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:35.870 [126/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:35.870 [127/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:35.870 [128/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:35.870 [129/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:35.870 [130/378] Linking static target lib/librte_timer.a 00:02:35.870 [131/378] Linking target lib/librte_kvargs.so.24.1 00:02:35.870 [132/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:35.870 [133/378] Linking static target lib/librte_cmdline.a 00:02:35.870 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:35.870 [135/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:36.132 [136/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:36.132 [137/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:36.132 [138/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:36.132 [139/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:36.132 [140/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:36.132 [141/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:36.132 [142/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:36.132 [143/378] Linking static 
target lib/librte_eal.a 00:02:36.132 [144/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:36.132 [145/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:36.132 [146/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:36.132 [147/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:36.132 [148/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.132 [149/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:36.132 [150/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:36.132 [151/378] Linking static target lib/librte_compressdev.a 00:02:36.132 [152/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:36.132 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:36.132 [154/378] Linking static target lib/librte_ring.a 00:02:36.132 [155/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:36.132 [156/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:36.132 [157/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:36.132 [158/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:36.132 [159/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:36.132 [160/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:36.132 [161/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:36.132 [162/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.132 [163/378] Linking static target lib/librte_dmadev.a 00:02:36.132 [164/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:36.132 [165/378] Compiling C object 
drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:36.132 [166/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:36.132 [167/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:36.132 [168/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:36.132 [169/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:36.132 [170/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.132 [171/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:36.132 [172/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:36.132 [173/378] Linking static target lib/librte_power.a 00:02:36.132 [174/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:36.132 [175/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:36.132 [176/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:36.396 [177/378] Linking static target lib/librte_reorder.a 00:02:36.396 [178/378] Linking target lib/librte_telemetry.so.24.1 00:02:36.396 [179/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:36.396 [180/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:36.396 [181/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:36.396 [182/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:36.396 [183/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:36.396 [184/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:36.396 [185/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:36.396 [186/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:36.396 [187/378] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:36.396 [188/378] Linking static target lib/librte_security.a 00:02:36.396 [189/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:36.396 [190/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:36.396 [191/378] Linking static target lib/librte_rcu.a 00:02:36.656 [192/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:36.656 [193/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:36.656 [194/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:36.656 [195/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:36.656 [196/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:36.656 [197/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:36.656 [198/378] Linking static target lib/librte_hash.a 00:02:36.656 [199/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:36.656 [200/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:36.656 [201/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:36.656 [202/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:36.656 [203/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:36.656 [204/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:36.656 [205/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:36.656 [206/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:36.656 [207/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:36.656 [208/378] Generating lib/timer.sym_chk with a custom command (wrapped 
by meson to capture output) 00:02:36.656 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:36.656 [210/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:36.656 [211/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.656 [212/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:36.656 [213/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:36.656 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:36.656 [215/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:36.656 [216/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:36.656 [217/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:36.656 [218/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:36.915 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:36.915 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:36.915 [221/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:36.915 [222/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:36.915 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:36.915 [224/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:36.915 [225/378] Linking static target drivers/librte_bus_vdev.a 00:02:36.915 [226/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.915 [227/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:36.915 [228/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:36.915 [229/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.915 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:36.915 [231/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:36.915 [232/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:36.915 [233/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:36.915 [234/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:36.915 [235/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:36.915 [236/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.915 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:36.915 [238/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:36.915 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:36.915 [240/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:36.915 [241/378] Linking static target lib/librte_cryptodev.a 00:02:36.915 [242/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.915 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:36.915 [244/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:36.915 [245/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 
00:02:36.915 [246/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.915 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:36.915 [248/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.915 [249/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:37.174 [250/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:37.174 [251/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:37.174 [252/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.175 [253/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:37.175 [254/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:37.175 [255/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:37.175 [256/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.175 [257/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:37.175 [258/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:37.175 [259/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:37.175 [260/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:37.175 [261/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:37.175 [262/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:37.175 [263/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:37.175 [264/378] Generating lib/power.sym_chk 
with a custom command (wrapped by meson to capture output) 00:02:37.175 [265/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:37.175 [266/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:37.434 [267/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:37.434 [268/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:37.434 [269/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:37.434 [270/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:37.434 [271/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:37.434 [272/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:37.434 [273/378] Linking static target drivers/librte_mempool_ring.a 00:02:37.434 [274/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.434 [275/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:37.434 [276/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:37.434 [277/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:37.434 [278/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:37.434 [279/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:37.434 [280/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:37.434 [281/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:37.434 [282/378] Linking static target lib/librte_ethdev.a 00:02:37.434 [283/378] Linking static target drivers/libtmp_rte_common_mlx5.a 
00:02:37.434 [284/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:37.434 [285/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:37.434 [286/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:37.434 [287/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:37.434 [288/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:37.434 [289/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:37.434 [290/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:37.434 [291/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:37.434 [292/378] Linking static target drivers/librte_compress_mlx5.a 00:02:37.434 [293/378] Linking static target drivers/librte_bus_pci.a 00:02:37.434 [294/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.694 [295/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:37.694 [296/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:37.694 [297/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:37.694 [298/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:37.694 [299/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:37.694 [300/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:37.694 [301/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:37.694 [302/378] Linking static target drivers/librte_compress_isal.a 00:02:37.694 [303/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:37.694 [304/378] Compiling C object 
drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:37.694 [305/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:37.694 [306/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:37.694 [307/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:37.694 [308/378] Linking static target drivers/librte_common_mlx5.a 00:02:38.262 [309/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:38.262 [310/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:38.262 [311/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:38.262 [312/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:38.262 [313/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.521 [314/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:38.521 [315/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:38.521 [316/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:38.521 [317/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:38.521 [318/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:38.521 [319/378] Linking static target drivers/librte_common_qat.a 00:02:38.779 [320/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:38.780 [321/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:38.780 [322/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:39.063 [323/378] Linking static target drivers/librte_crypto_ipsec_mb.a 
00:02:39.063 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:39.063 [325/378] Linking static target lib/librte_vhost.a 00:02:39.063 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.606 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:44.141 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:47.435 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:49.343 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:49.343 [331/378] Linking target lib/librte_eal.so.24.1 00:02:49.603 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:49.603 [333/378] Linking target lib/librte_ring.so.24.1 00:02:49.603 [334/378] Linking target lib/librte_dmadev.so.24.1 00:02:49.603 [335/378] Linking target lib/librte_timer.so.24.1 00:02:49.603 [336/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:49.603 [337/378] Linking target lib/librte_meter.so.24.1 00:02:49.603 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:49.603 [339/378] Linking target lib/librte_pci.so.24.1 00:02:49.862 [340/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:49.862 [341/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:49.862 [342/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:49.862 [343/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:49.862 [344/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:49.862 [345/378] Linking target lib/librte_rcu.so.24.1 00:02:49.862 [346/378] Linking target lib/librte_mempool.so.24.1 00:02:49.862 
[347/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:49.862 [348/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:50.122 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:50.122 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:50.122 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:50.122 [352/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:50.122 [353/378] Linking target lib/librte_mbuf.so.24.1 00:02:50.122 [354/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:50.122 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:50.381 [356/378] Linking target lib/librte_net.so.24.1 00:02:50.381 [357/378] Linking target lib/librte_reorder.so.24.1 00:02:50.381 [358/378] Linking target lib/librte_compressdev.so.24.1 00:02:50.381 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:50.381 [360/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:50.381 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:50.381 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:50.381 [363/378] Linking target lib/librte_hash.so.24.1 00:02:50.381 [364/378] Linking target lib/librte_security.so.24.1 00:02:50.381 [365/378] Linking target lib/librte_ethdev.so.24.1 00:02:50.381 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:50.641 [367/378] Linking target lib/librte_cmdline.so.24.1 00:02:50.641 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:50.641 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:50.641 [370/378] Generating symbol file 
lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:50.641 [371/378] Linking target lib/librte_power.so.24.1 00:02:50.641 [372/378] Linking target lib/librte_vhost.so.24.1 00:02:50.641 [373/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:50.900 [374/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:50.900 [375/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:50.900 [376/378] Linking target drivers/librte_common_qat.so.24.1 00:02:50.900 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:51.159 [378/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:51.159 INFO: autodetecting backend as ninja 00:02:51.159 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:52.541 CC lib/ut/ut.o 00:02:52.541 CC lib/log/log.o 00:02:52.541 CC lib/log/log_flags.o 00:02:52.541 CC lib/log/log_deprecated.o 00:02:52.541 CC lib/ut_mock/mock.o 00:02:52.541 LIB libspdk_ut.a 00:02:52.541 SO libspdk_ut.so.2.0 00:02:52.541 LIB libspdk_ut_mock.a 00:02:52.541 LIB libspdk_log.a 00:02:52.541 SO libspdk_ut_mock.so.6.0 00:02:52.541 SYMLINK libspdk_ut.so 00:02:52.541 SO libspdk_log.so.7.0 00:02:52.801 SYMLINK libspdk_ut_mock.so 00:02:52.801 SYMLINK libspdk_log.so 00:02:53.060 CC lib/dma/dma.o 00:02:53.060 CC lib/ioat/ioat.o 00:02:53.060 CC lib/util/base64.o 00:02:53.060 CC lib/util/bit_array.o 00:02:53.060 CC lib/util/cpuset.o 00:02:53.060 CC lib/util/crc16.o 00:02:53.060 CC lib/util/crc32.o 00:02:53.060 CC lib/util/crc32c.o 00:02:53.060 CC lib/util/crc32_ieee.o 00:02:53.060 CXX lib/trace_parser/trace.o 00:02:53.060 CC lib/util/crc64.o 00:02:53.060 CC lib/util/dif.o 00:02:53.060 CC lib/util/fd.o 00:02:53.060 CC lib/util/fd_group.o 00:02:53.060 CC lib/util/file.o 00:02:53.060 CC lib/util/hexlify.o 00:02:53.060 CC lib/util/iov.o 00:02:53.060 CC lib/util/math.o 00:02:53.060 CC lib/util/net.o 
00:02:53.060 CC lib/util/strerror_tls.o 00:02:53.060 CC lib/util/pipe.o 00:02:53.060 CC lib/util/string.o 00:02:53.060 CC lib/util/xor.o 00:02:53.060 CC lib/util/uuid.o 00:02:53.060 CC lib/util/zipf.o 00:02:53.319 CC lib/vfio_user/host/vfio_user_pci.o 00:02:53.319 CC lib/vfio_user/host/vfio_user.o 00:02:53.319 LIB libspdk_dma.a 00:02:53.319 SO libspdk_dma.so.4.0 00:02:53.319 LIB libspdk_ioat.a 00:02:53.319 SYMLINK libspdk_dma.so 00:02:53.578 SO libspdk_ioat.so.7.0 00:02:53.579 LIB libspdk_vfio_user.a 00:02:53.579 SYMLINK libspdk_ioat.so 00:02:53.579 SO libspdk_vfio_user.so.5.0 00:02:53.579 SYMLINK libspdk_vfio_user.so 00:02:53.579 LIB libspdk_util.a 00:02:53.837 SO libspdk_util.so.10.0 00:02:54.097 SYMLINK libspdk_util.so 00:02:54.097 LIB libspdk_trace_parser.a 00:02:54.097 SO libspdk_trace_parser.so.5.0 00:02:54.356 SYMLINK libspdk_trace_parser.so 00:02:54.356 CC lib/reduce/reduce.o 00:02:54.356 CC lib/vmd/vmd.o 00:02:54.356 CC lib/vmd/led.o 00:02:54.356 CC lib/rdma_utils/rdma_utils.o 00:02:54.356 CC lib/idxd/idxd.o 00:02:54.356 CC lib/idxd/idxd_user.o 00:02:54.356 CC lib/idxd/idxd_kernel.o 00:02:54.356 CC lib/env_dpdk/env.o 00:02:54.356 CC lib/env_dpdk/memory.o 00:02:54.356 CC lib/rdma_provider/common.o 00:02:54.356 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:54.356 CC lib/json/json_parse.o 00:02:54.356 CC lib/env_dpdk/pci.o 00:02:54.356 CC lib/env_dpdk/init.o 00:02:54.356 CC lib/conf/conf.o 00:02:54.356 CC lib/json/json_util.o 00:02:54.356 CC lib/json/json_write.o 00:02:54.356 CC lib/env_dpdk/threads.o 00:02:54.356 CC lib/env_dpdk/pci_ioat.o 00:02:54.356 CC lib/env_dpdk/pci_virtio.o 00:02:54.356 CC lib/env_dpdk/pci_vmd.o 00:02:54.356 CC lib/env_dpdk/pci_idxd.o 00:02:54.356 CC lib/env_dpdk/pci_event.o 00:02:54.356 CC lib/env_dpdk/sigbus_handler.o 00:02:54.356 CC lib/env_dpdk/pci_dpdk.o 00:02:54.356 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:54.356 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:54.614 LIB libspdk_rdma_provider.a 00:02:54.614 LIB libspdk_conf.a 00:02:54.614 
LIB libspdk_rdma_utils.a 00:02:54.614 LIB libspdk_json.a 00:02:54.615 SO libspdk_rdma_provider.so.6.0 00:02:54.615 SO libspdk_conf.so.6.0 00:02:54.615 SO libspdk_rdma_utils.so.1.0 00:02:54.615 SO libspdk_json.so.6.0 00:02:54.873 SYMLINK libspdk_rdma_provider.so 00:02:54.873 SYMLINK libspdk_conf.so 00:02:54.873 SYMLINK libspdk_rdma_utils.so 00:02:54.873 SYMLINK libspdk_json.so 00:02:55.133 LIB libspdk_vmd.a 00:02:55.133 LIB libspdk_reduce.a 00:02:55.133 LIB libspdk_idxd.a 00:02:55.133 SO libspdk_vmd.so.6.0 00:02:55.133 SO libspdk_reduce.so.6.1 00:02:55.133 SO libspdk_idxd.so.12.0 00:02:55.133 SYMLINK libspdk_vmd.so 00:02:55.133 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:55.133 CC lib/jsonrpc/jsonrpc_server.o 00:02:55.133 CC lib/jsonrpc/jsonrpc_client.o 00:02:55.133 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:55.133 SYMLINK libspdk_idxd.so 00:02:55.133 SYMLINK libspdk_reduce.so 00:02:55.392 LIB libspdk_jsonrpc.a 00:02:55.653 SO libspdk_jsonrpc.so.6.0 00:02:55.653 SYMLINK libspdk_jsonrpc.so 00:02:55.913 CC lib/rpc/rpc.o 00:02:56.173 LIB libspdk_rpc.a 00:02:56.173 SO libspdk_rpc.so.6.0 00:02:56.433 SYMLINK libspdk_rpc.so 00:02:56.693 CC lib/notify/notify.o 00:02:56.693 CC lib/notify/notify_rpc.o 00:02:56.693 CC lib/keyring/keyring.o 00:02:56.693 CC lib/keyring/keyring_rpc.o 00:02:56.693 CC lib/trace/trace.o 00:02:56.693 CC lib/trace/trace_flags.o 00:02:56.693 CC lib/trace/trace_rpc.o 00:02:56.953 LIB libspdk_notify.a 00:02:56.953 SO libspdk_notify.so.6.0 00:02:56.953 LIB libspdk_trace.a 00:02:56.953 SYMLINK libspdk_notify.so 00:02:56.953 SO libspdk_trace.so.10.0 00:02:57.212 SYMLINK libspdk_trace.so 00:02:57.212 LIB libspdk_keyring.a 00:02:57.212 SO libspdk_keyring.so.1.0 00:02:57.212 LIB libspdk_env_dpdk.a 00:02:57.212 SYMLINK libspdk_keyring.so 00:02:57.471 SO libspdk_env_dpdk.so.15.0 00:02:57.471 CC lib/thread/thread.o 00:02:57.471 CC lib/thread/iobuf.o 00:02:57.471 CC lib/sock/sock.o 00:02:57.471 CC lib/sock/sock_rpc.o 00:02:57.731 SYMLINK libspdk_env_dpdk.so 
00:02:57.731 LIB libspdk_sock.a 00:02:57.731 SO libspdk_sock.so.10.0 00:02:57.990 SYMLINK libspdk_sock.so 00:02:58.249 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:58.249 CC lib/nvme/nvme_ctrlr.o 00:02:58.249 CC lib/nvme/nvme_fabric.o 00:02:58.249 CC lib/nvme/nvme_ns_cmd.o 00:02:58.249 CC lib/nvme/nvme_ns.o 00:02:58.249 CC lib/nvme/nvme_pcie_common.o 00:02:58.249 CC lib/nvme/nvme_pcie.o 00:02:58.249 CC lib/nvme/nvme_qpair.o 00:02:58.249 CC lib/nvme/nvme.o 00:02:58.249 CC lib/nvme/nvme_quirks.o 00:02:58.249 CC lib/nvme/nvme_transport.o 00:02:58.249 CC lib/nvme/nvme_discovery.o 00:02:58.249 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:58.249 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:58.249 CC lib/nvme/nvme_tcp.o 00:02:58.249 CC lib/nvme/nvme_opal.o 00:02:58.249 CC lib/nvme/nvme_io_msg.o 00:02:58.249 CC lib/nvme/nvme_poll_group.o 00:02:58.249 CC lib/nvme/nvme_zns.o 00:02:58.249 CC lib/nvme/nvme_stubs.o 00:02:58.249 CC lib/nvme/nvme_auth.o 00:02:58.249 CC lib/nvme/nvme_cuse.o 00:02:58.249 CC lib/nvme/nvme_rdma.o 00:02:59.186 LIB libspdk_thread.a 00:02:59.186 SO libspdk_thread.so.10.1 00:02:59.186 SYMLINK libspdk_thread.so 00:02:59.445 CC lib/blob/blobstore.o 00:02:59.445 CC lib/blob/request.o 00:02:59.445 CC lib/blob/zeroes.o 00:02:59.445 CC lib/blob/blob_bs_dev.o 00:02:59.445 CC lib/accel/accel.o 00:02:59.445 CC lib/accel/accel_rpc.o 00:02:59.445 CC lib/accel/accel_sw.o 00:02:59.445 CC lib/virtio/virtio.o 00:02:59.445 CC lib/virtio/virtio_vhost_user.o 00:02:59.445 CC lib/virtio/virtio_vfio_user.o 00:02:59.445 CC lib/virtio/virtio_pci.o 00:02:59.445 CC lib/init/json_config.o 00:02:59.445 CC lib/init/subsystem.o 00:02:59.445 CC lib/init/subsystem_rpc.o 00:02:59.445 CC lib/init/rpc.o 00:02:59.704 LIB libspdk_init.a 00:02:59.704 SO libspdk_init.so.5.0 00:03:00.015 SYMLINK libspdk_init.so 00:03:00.275 LIB libspdk_virtio.a 00:03:00.275 SO libspdk_virtio.so.7.0 00:03:00.275 CC lib/event/app.o 00:03:00.275 CC lib/event/log_rpc.o 00:03:00.275 CC lib/event/reactor.o 00:03:00.275 CC 
lib/event/app_rpc.o 00:03:00.275 CC lib/event/scheduler_static.o 00:03:00.275 SYMLINK libspdk_virtio.so 00:03:00.535 LIB libspdk_accel.a 00:03:00.535 LIB libspdk_nvme.a 00:03:00.535 SO libspdk_accel.so.16.0 00:03:00.535 SYMLINK libspdk_accel.so 00:03:00.793 SO libspdk_nvme.so.13.1 00:03:00.793 LIB libspdk_event.a 00:03:00.793 SO libspdk_event.so.14.0 00:03:01.052 SYMLINK libspdk_event.so 00:03:01.052 CC lib/bdev/bdev.o 00:03:01.052 CC lib/bdev/bdev_rpc.o 00:03:01.052 CC lib/bdev/bdev_zone.o 00:03:01.052 CC lib/bdev/part.o 00:03:01.052 CC lib/bdev/scsi_nvme.o 00:03:01.052 SYMLINK libspdk_nvme.so 00:03:02.441 LIB libspdk_blob.a 00:03:02.441 SO libspdk_blob.so.11.0 00:03:02.701 SYMLINK libspdk_blob.so 00:03:02.961 CC lib/blobfs/blobfs.o 00:03:02.961 CC lib/lvol/lvol.o 00:03:02.961 CC lib/blobfs/tree.o 00:03:03.900 LIB libspdk_lvol.a 00:03:04.160 SO libspdk_lvol.so.10.0 00:03:04.160 LIB libspdk_blobfs.a 00:03:04.160 SYMLINK libspdk_lvol.so 00:03:04.160 SO libspdk_blobfs.so.10.0 00:03:04.160 SYMLINK libspdk_blobfs.so 00:03:06.700 LIB libspdk_bdev.a 00:03:06.700 SO libspdk_bdev.so.16.0 00:03:06.700 SYMLINK libspdk_bdev.so 00:03:07.274 CC lib/scsi/dev.o 00:03:07.274 CC lib/scsi/lun.o 00:03:07.274 CC lib/scsi/port.o 00:03:07.274 CC lib/scsi/scsi.o 00:03:07.274 CC lib/scsi/scsi_bdev.o 00:03:07.274 CC lib/scsi/scsi_pr.o 00:03:07.274 CC lib/scsi/scsi_rpc.o 00:03:07.274 CC lib/scsi/task.o 00:03:07.274 CC lib/nvmf/ctrlr.o 00:03:07.274 CC lib/nvmf/ctrlr_discovery.o 00:03:07.274 CC lib/ublk/ublk.o 00:03:07.274 CC lib/nvmf/ctrlr_bdev.o 00:03:07.274 CC lib/ftl/ftl_core.o 00:03:07.274 CC lib/ublk/ublk_rpc.o 00:03:07.274 CC lib/nvmf/subsystem.o 00:03:07.274 CC lib/nvmf/nvmf.o 00:03:07.274 CC lib/ftl/ftl_init.o 00:03:07.274 CC lib/ftl/ftl_layout.o 00:03:07.274 CC lib/ftl/ftl_debug.o 00:03:07.274 CC lib/nvmf/nvmf_rpc.o 00:03:07.274 CC lib/nvmf/transport.o 00:03:07.274 CC lib/ftl/ftl_io.o 00:03:07.274 CC lib/nvmf/tcp.o 00:03:07.274 CC lib/nbd/nbd.o 00:03:07.274 CC lib/ftl/ftl_sb.o 
00:03:07.274 CC lib/nbd/nbd_rpc.o 00:03:07.274 CC lib/nvmf/stubs.o 00:03:07.274 CC lib/ftl/ftl_l2p.o 00:03:07.274 CC lib/nvmf/rdma.o 00:03:07.274 CC lib/nvmf/mdns_server.o 00:03:07.274 CC lib/ftl/ftl_l2p_flat.o 00:03:07.274 CC lib/ftl/ftl_nv_cache.o 00:03:07.274 CC lib/ftl/ftl_band.o 00:03:07.274 CC lib/nvmf/auth.o 00:03:07.274 CC lib/ftl/ftl_band_ops.o 00:03:07.274 CC lib/ftl/ftl_rq.o 00:03:07.274 CC lib/ftl/ftl_reloc.o 00:03:07.274 CC lib/ftl/ftl_writer.o 00:03:07.274 CC lib/ftl/ftl_l2p_cache.o 00:03:07.274 CC lib/ftl/ftl_p2l.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:07.274 CC lib/ftl/utils/ftl_conf.o 00:03:07.274 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:07.274 CC lib/ftl/utils/ftl_mempool.o 00:03:07.274 CC lib/ftl/utils/ftl_md.o 00:03:07.274 CC lib/ftl/utils/ftl_bitmap.o 00:03:07.274 CC lib/ftl/utils/ftl_property.o 00:03:07.274 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:07.274 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:07.274 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:07.274 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:07.274 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:07.274 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:07.274 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:07.274 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:07.274 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:07.274 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:07.274 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:07.274 CC lib/ftl/base/ftl_base_dev.o 00:03:07.274 CC lib/ftl/base/ftl_base_bdev.o 00:03:07.274 CC 
lib/ftl/ftl_trace.o 00:03:07.842 LIB libspdk_nbd.a 00:03:07.842 SO libspdk_nbd.so.7.0 00:03:08.102 SYMLINK libspdk_nbd.so 00:03:08.102 LIB libspdk_ublk.a 00:03:08.102 SO libspdk_ublk.so.3.0 00:03:08.102 LIB libspdk_scsi.a 00:03:08.102 SYMLINK libspdk_ublk.so 00:03:08.361 SO libspdk_scsi.so.9.0 00:03:08.361 SYMLINK libspdk_scsi.so 00:03:08.620 LIB libspdk_ftl.a 00:03:08.620 SO libspdk_ftl.so.9.0 00:03:08.880 CC lib/vhost/vhost.o 00:03:08.880 CC lib/vhost/vhost_rpc.o 00:03:08.880 CC lib/vhost/vhost_scsi.o 00:03:08.880 CC lib/vhost/vhost_blk.o 00:03:08.880 CC lib/vhost/rte_vhost_user.o 00:03:08.880 CC lib/iscsi/conn.o 00:03:08.880 CC lib/iscsi/init_grp.o 00:03:08.880 CC lib/iscsi/iscsi.o 00:03:08.880 CC lib/iscsi/md5.o 00:03:08.880 CC lib/iscsi/param.o 00:03:08.880 CC lib/iscsi/portal_grp.o 00:03:08.880 CC lib/iscsi/tgt_node.o 00:03:08.880 CC lib/iscsi/iscsi_subsystem.o 00:03:08.880 CC lib/iscsi/iscsi_rpc.o 00:03:08.880 CC lib/iscsi/task.o 00:03:09.449 SYMLINK libspdk_ftl.so 00:03:10.018 LIB libspdk_nvmf.a 00:03:10.018 LIB libspdk_vhost.a 00:03:10.018 SO libspdk_vhost.so.8.0 00:03:10.018 SO libspdk_nvmf.so.19.0 00:03:10.278 SYMLINK libspdk_vhost.so 00:03:10.278 SYMLINK libspdk_nvmf.so 00:03:10.278 LIB libspdk_iscsi.a 00:03:10.278 SO libspdk_iscsi.so.8.0 00:03:10.538 SYMLINK libspdk_iscsi.so 00:03:11.107 CC module/env_dpdk/env_dpdk_rpc.o 00:03:11.365 CC module/sock/posix/posix.o 00:03:11.365 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:11.365 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:11.365 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:03:11.365 LIB libspdk_env_dpdk_rpc.a 00:03:11.365 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:03:11.365 CC module/scheduler/gscheduler/gscheduler.o 00:03:11.365 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:03:11.365 CC module/accel/dsa/accel_dsa.o 00:03:11.365 CC module/accel/error/accel_error.o 00:03:11.365 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 
00:03:11.365 CC module/accel/error/accel_error_rpc.o 00:03:11.365 CC module/accel/dsa/accel_dsa_rpc.o 00:03:11.365 CC module/keyring/file/keyring.o 00:03:11.365 CC module/keyring/file/keyring_rpc.o 00:03:11.365 CC module/keyring/linux/keyring.o 00:03:11.365 CC module/keyring/linux/keyring_rpc.o 00:03:11.365 CC module/accel/ioat/accel_ioat_rpc.o 00:03:11.365 CC module/accel/ioat/accel_ioat.o 00:03:11.365 CC module/blob/bdev/blob_bdev.o 00:03:11.365 CC module/accel/iaa/accel_iaa.o 00:03:11.365 CC module/accel/iaa/accel_iaa_rpc.o 00:03:11.365 SO libspdk_env_dpdk_rpc.so.6.0 00:03:11.365 SYMLINK libspdk_env_dpdk_rpc.so 00:03:11.365 LIB libspdk_scheduler_dpdk_governor.a 00:03:11.365 LIB libspdk_scheduler_gscheduler.a 00:03:11.365 LIB libspdk_keyring_linux.a 00:03:11.365 LIB libspdk_keyring_file.a 00:03:11.624 LIB libspdk_scheduler_dynamic.a 00:03:11.624 LIB libspdk_accel_error.a 00:03:11.624 SO libspdk_scheduler_gscheduler.so.4.0 00:03:11.624 SO libspdk_scheduler_dpdk_governor.so.4.0 00:03:11.624 LIB libspdk_accel_dsa.a 00:03:11.624 LIB libspdk_accel_ioat.a 00:03:11.624 SO libspdk_keyring_linux.so.1.0 00:03:11.624 SO libspdk_scheduler_dynamic.so.4.0 00:03:11.624 SO libspdk_keyring_file.so.1.0 00:03:11.624 LIB libspdk_accel_iaa.a 00:03:11.624 SO libspdk_accel_error.so.2.0 00:03:11.624 SO libspdk_accel_ioat.so.6.0 00:03:11.624 SYMLINK libspdk_scheduler_gscheduler.so 00:03:11.624 SO libspdk_accel_dsa.so.5.0 00:03:11.624 SYMLINK libspdk_scheduler_dpdk_governor.so 00:03:11.624 LIB libspdk_blob_bdev.a 00:03:11.624 SO libspdk_accel_iaa.so.3.0 00:03:11.624 SYMLINK libspdk_keyring_linux.so 00:03:11.624 SYMLINK libspdk_keyring_file.so 00:03:11.624 SYMLINK libspdk_accel_error.so 00:03:11.624 SO libspdk_blob_bdev.so.11.0 00:03:11.624 SYMLINK libspdk_accel_ioat.so 00:03:11.624 SYMLINK libspdk_scheduler_dynamic.so 00:03:11.624 SYMLINK libspdk_accel_dsa.so 00:03:11.624 SYMLINK libspdk_accel_iaa.so 00:03:11.884 SYMLINK libspdk_blob_bdev.so 00:03:12.142 LIB libspdk_sock_posix.a 
00:03:12.142 SO libspdk_sock_posix.so.6.0 00:03:12.142 SYMLINK libspdk_sock_posix.so 00:03:12.142 CC module/blobfs/bdev/blobfs_bdev.o 00:03:12.142 CC module/bdev/delay/vbdev_delay.o 00:03:12.142 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:12.142 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:12.142 CC module/bdev/split/vbdev_split.o 00:03:12.142 CC module/bdev/split/vbdev_split_rpc.o 00:03:12.142 CC module/bdev/null/bdev_null.o 00:03:12.142 CC module/bdev/null/bdev_null_rpc.o 00:03:12.142 CC module/bdev/gpt/vbdev_gpt.o 00:03:12.142 CC module/bdev/gpt/gpt.o 00:03:12.398 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:12.398 CC module/bdev/ftl/bdev_ftl.o 00:03:12.398 CC module/bdev/passthru/vbdev_passthru.o 00:03:12.398 CC module/bdev/malloc/bdev_malloc.o 00:03:12.398 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:12.398 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:12.398 CC module/bdev/raid/bdev_raid.o 00:03:12.398 CC module/bdev/error/vbdev_error.o 00:03:12.398 CC module/bdev/raid/bdev_raid_rpc.o 00:03:12.399 CC module/bdev/error/vbdev_error_rpc.o 00:03:12.399 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:12.399 CC module/bdev/raid/raid0.o 00:03:12.399 CC module/bdev/raid/bdev_raid_sb.o 00:03:12.399 CC module/bdev/iscsi/bdev_iscsi.o 00:03:12.399 CC module/bdev/raid/raid1.o 00:03:12.399 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:12.399 CC module/bdev/raid/concat.o 00:03:12.399 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:12.399 CC module/bdev/nvme/bdev_nvme.o 00:03:12.399 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:12.399 CC module/bdev/lvol/vbdev_lvol.o 00:03:12.399 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:12.399 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:12.399 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:12.399 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:12.399 CC module/bdev/nvme/nvme_rpc.o 00:03:12.399 CC module/bdev/nvme/bdev_mdns_client.o 00:03:12.399 CC module/bdev/compress/vbdev_compress.o 00:03:12.399 CC module/bdev/nvme/vbdev_opal.o 
00:03:12.399 CC module/bdev/aio/bdev_aio.o 00:03:12.399 CC module/bdev/compress/vbdev_compress_rpc.o 00:03:12.399 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:12.399 CC module/bdev/crypto/vbdev_crypto.o 00:03:12.399 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:12.399 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:03:12.399 CC module/bdev/aio/bdev_aio_rpc.o 00:03:12.399 LIB libspdk_accel_dpdk_compressdev.a 00:03:12.399 SO libspdk_accel_dpdk_compressdev.so.3.0 00:03:12.657 LIB libspdk_blobfs_bdev.a 00:03:12.657 LIB libspdk_bdev_split.a 00:03:12.657 SO libspdk_blobfs_bdev.so.6.0 00:03:12.657 SO libspdk_bdev_split.so.6.0 00:03:12.657 LIB libspdk_bdev_gpt.a 00:03:12.657 SYMLINK libspdk_blobfs_bdev.so 00:03:12.657 SYMLINK libspdk_bdev_split.so 00:03:12.657 SO libspdk_bdev_gpt.so.6.0 00:03:12.657 LIB libspdk_accel_dpdk_cryptodev.a 00:03:12.657 SYMLINK libspdk_accel_dpdk_compressdev.so 00:03:12.657 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:03:12.657 LIB libspdk_bdev_iscsi.a 00:03:12.657 LIB libspdk_bdev_crypto.a 00:03:12.657 SYMLINK libspdk_bdev_gpt.so 00:03:12.657 LIB libspdk_bdev_error.a 00:03:12.657 LIB libspdk_bdev_malloc.a 00:03:12.657 LIB libspdk_bdev_compress.a 00:03:12.657 LIB libspdk_bdev_null.a 00:03:12.657 SO libspdk_bdev_iscsi.so.6.0 00:03:12.915 SO libspdk_bdev_crypto.so.6.0 00:03:12.915 LIB libspdk_bdev_ftl.a 00:03:12.915 SO libspdk_bdev_malloc.so.6.0 00:03:12.915 SO libspdk_bdev_error.so.6.0 00:03:12.915 LIB libspdk_bdev_zone_block.a 00:03:12.915 LIB libspdk_bdev_passthru.a 00:03:12.915 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:03:12.915 SO libspdk_bdev_compress.so.6.0 00:03:12.915 SO libspdk_bdev_null.so.6.0 00:03:12.915 LIB libspdk_bdev_delay.a 00:03:12.915 SO libspdk_bdev_ftl.so.6.0 00:03:12.915 SO libspdk_bdev_zone_block.so.6.0 00:03:12.915 SO libspdk_bdev_passthru.so.6.0 00:03:12.915 SYMLINK libspdk_bdev_iscsi.so 00:03:12.915 SYMLINK libspdk_bdev_malloc.so 00:03:12.915 SO libspdk_bdev_delay.so.6.0 00:03:12.915 SYMLINK libspdk_bdev_error.so 00:03:12.915 
SYMLINK libspdk_bdev_compress.so 00:03:12.915 SYMLINK libspdk_bdev_null.so 00:03:12.915 LIB libspdk_bdev_virtio.a 00:03:12.915 SYMLINK libspdk_bdev_zone_block.so 00:03:12.915 SYMLINK libspdk_bdev_ftl.so 00:03:12.915 SYMLINK libspdk_bdev_passthru.so 00:03:12.915 SYMLINK libspdk_bdev_crypto.so 00:03:12.915 SO libspdk_bdev_virtio.so.6.0 00:03:12.915 SYMLINK libspdk_bdev_delay.so 00:03:12.915 LIB libspdk_bdev_lvol.a 00:03:12.915 SO libspdk_bdev_lvol.so.6.0 00:03:12.915 SYMLINK libspdk_bdev_virtio.so 00:03:13.174 SYMLINK libspdk_bdev_lvol.so 00:03:13.174 LIB libspdk_bdev_aio.a 00:03:13.174 SO libspdk_bdev_aio.so.6.0 00:03:13.174 SYMLINK libspdk_bdev_aio.so 00:03:13.433 LIB libspdk_bdev_raid.a 00:03:13.433 SO libspdk_bdev_raid.so.6.0 00:03:13.433 SYMLINK libspdk_bdev_raid.so 00:03:14.809 LIB libspdk_bdev_nvme.a 00:03:14.809 SO libspdk_bdev_nvme.so.7.0 00:03:14.809 SYMLINK libspdk_bdev_nvme.so 00:03:15.745 CC module/event/subsystems/keyring/keyring.o 00:03:15.745 CC module/event/subsystems/scheduler/scheduler.o 00:03:15.745 CC module/event/subsystems/sock/sock.o 00:03:15.745 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:15.745 CC module/event/subsystems/iobuf/iobuf.o 00:03:15.745 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:15.745 CC module/event/subsystems/vmd/vmd.o 00:03:15.745 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:15.745 LIB libspdk_event_keyring.a 00:03:15.745 LIB libspdk_event_scheduler.a 00:03:15.745 LIB libspdk_event_vhost_blk.a 00:03:15.745 LIB libspdk_event_sock.a 00:03:15.745 LIB libspdk_event_vmd.a 00:03:15.745 SO libspdk_event_keyring.so.1.0 00:03:15.745 SO libspdk_event_scheduler.so.4.0 00:03:15.745 SO libspdk_event_vhost_blk.so.3.0 00:03:15.745 SO libspdk_event_sock.so.5.0 00:03:15.745 SO libspdk_event_vmd.so.6.0 00:03:15.745 SYMLINK libspdk_event_keyring.so 00:03:15.745 SYMLINK libspdk_event_scheduler.so 00:03:15.745 SYMLINK libspdk_event_vhost_blk.so 00:03:15.745 SYMLINK libspdk_event_sock.so 00:03:15.745 SYMLINK 
libspdk_event_vmd.so 00:03:16.004 LIB libspdk_event_iobuf.a 00:03:16.004 SO libspdk_event_iobuf.so.3.0 00:03:16.004 SYMLINK libspdk_event_iobuf.so 00:03:16.572 CC module/event/subsystems/accel/accel.o 00:03:16.572 LIB libspdk_event_accel.a 00:03:16.572 SO libspdk_event_accel.so.6.0 00:03:16.572 SYMLINK libspdk_event_accel.so 00:03:17.138 CC module/event/subsystems/bdev/bdev.o 00:03:17.397 LIB libspdk_event_bdev.a 00:03:17.397 SO libspdk_event_bdev.so.6.0 00:03:17.397 SYMLINK libspdk_event_bdev.so 00:03:17.656 CC module/event/subsystems/scsi/scsi.o 00:03:17.656 CC module/event/subsystems/nbd/nbd.o 00:03:17.656 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:17.656 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:17.656 CC module/event/subsystems/ublk/ublk.o 00:03:17.914 LIB libspdk_event_nbd.a 00:03:17.914 LIB libspdk_event_ublk.a 00:03:17.914 LIB libspdk_event_scsi.a 00:03:17.914 SO libspdk_event_nbd.so.6.0 00:03:17.914 SO libspdk_event_scsi.so.6.0 00:03:17.914 SO libspdk_event_ublk.so.3.0 00:03:18.172 SYMLINK libspdk_event_nbd.so 00:03:18.172 SYMLINK libspdk_event_scsi.so 00:03:18.172 SYMLINK libspdk_event_ublk.so 00:03:18.172 LIB libspdk_event_nvmf.a 00:03:18.433 SO libspdk_event_nvmf.so.6.0 00:03:18.433 SYMLINK libspdk_event_nvmf.so 00:03:18.433 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:18.433 CC module/event/subsystems/iscsi/iscsi.o 00:03:18.763 LIB libspdk_event_vhost_scsi.a 00:03:18.763 LIB libspdk_event_iscsi.a 00:03:18.763 SO libspdk_event_vhost_scsi.so.3.0 00:03:18.763 SO libspdk_event_iscsi.so.6.0 00:03:18.763 SYMLINK libspdk_event_vhost_scsi.so 00:03:18.763 SYMLINK libspdk_event_iscsi.so 00:03:19.022 SO libspdk.so.6.0 00:03:19.022 SYMLINK libspdk.so 00:03:19.281 CC app/trace_record/trace_record.o 00:03:19.281 CC test/rpc_client/rpc_client_test.o 00:03:19.281 CXX app/trace/trace.o 00:03:19.281 CC app/spdk_nvme_perf/perf.o 00:03:19.281 CC app/spdk_lspci/spdk_lspci.o 00:03:19.281 CC app/spdk_top/spdk_top.o 00:03:19.281 TEST_HEADER 
include/spdk/accel.h 00:03:19.281 TEST_HEADER include/spdk/accel_module.h 00:03:19.281 TEST_HEADER include/spdk/assert.h 00:03:19.281 TEST_HEADER include/spdk/base64.h 00:03:19.281 TEST_HEADER include/spdk/bdev_module.h 00:03:19.281 TEST_HEADER include/spdk/barrier.h 00:03:19.281 TEST_HEADER include/spdk/bdev_zone.h 00:03:19.281 TEST_HEADER include/spdk/bit_array.h 00:03:19.281 TEST_HEADER include/spdk/bdev.h 00:03:19.281 CC app/spdk_nvme_discover/discovery_aer.o 00:03:19.281 TEST_HEADER include/spdk/blob_bdev.h 00:03:19.281 TEST_HEADER include/spdk/bit_pool.h 00:03:19.281 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:19.281 TEST_HEADER include/spdk/blobfs.h 00:03:19.281 TEST_HEADER include/spdk/config.h 00:03:19.281 TEST_HEADER include/spdk/cpuset.h 00:03:19.281 TEST_HEADER include/spdk/blob.h 00:03:19.281 TEST_HEADER include/spdk/conf.h 00:03:19.281 TEST_HEADER include/spdk/crc16.h 00:03:19.281 TEST_HEADER include/spdk/crc64.h 00:03:19.281 CC app/spdk_nvme_identify/identify.o 00:03:19.281 TEST_HEADER include/spdk/dif.h 00:03:19.281 TEST_HEADER include/spdk/dma.h 00:03:19.281 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:19.281 TEST_HEADER include/spdk/crc32.h 00:03:19.282 TEST_HEADER include/spdk/endian.h 00:03:19.282 TEST_HEADER include/spdk/env.h 00:03:19.282 TEST_HEADER include/spdk/env_dpdk.h 00:03:19.282 TEST_HEADER include/spdk/event.h 00:03:19.282 TEST_HEADER include/spdk/fd_group.h 00:03:19.282 TEST_HEADER include/spdk/fd.h 00:03:19.282 TEST_HEADER include/spdk/file.h 00:03:19.282 TEST_HEADER include/spdk/ftl.h 00:03:19.282 TEST_HEADER include/spdk/gpt_spec.h 00:03:19.282 TEST_HEADER include/spdk/hexlify.h 00:03:19.282 TEST_HEADER include/spdk/idxd.h 00:03:19.282 TEST_HEADER include/spdk/histogram_data.h 00:03:19.282 TEST_HEADER include/spdk/idxd_spec.h 00:03:19.282 TEST_HEADER include/spdk/init.h 00:03:19.282 TEST_HEADER include/spdk/ioat.h 00:03:19.282 TEST_HEADER include/spdk/ioat_spec.h 00:03:19.282 TEST_HEADER include/spdk/iscsi_spec.h 00:03:19.282 
TEST_HEADER include/spdk/json.h 00:03:19.282 TEST_HEADER include/spdk/jsonrpc.h 00:03:19.282 TEST_HEADER include/spdk/keyring.h 00:03:19.282 TEST_HEADER include/spdk/keyring_module.h 00:03:19.282 TEST_HEADER include/spdk/log.h 00:03:19.282 TEST_HEADER include/spdk/likely.h 00:03:19.282 TEST_HEADER include/spdk/lvol.h 00:03:19.282 TEST_HEADER include/spdk/mmio.h 00:03:19.282 TEST_HEADER include/spdk/nbd.h 00:03:19.282 TEST_HEADER include/spdk/memory.h 00:03:19.282 TEST_HEADER include/spdk/net.h 00:03:19.282 TEST_HEADER include/spdk/notify.h 00:03:19.282 TEST_HEADER include/spdk/nvme.h 00:03:19.282 TEST_HEADER include/spdk/nvme_intel.h 00:03:19.282 CC app/spdk_dd/spdk_dd.o 00:03:19.282 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:19.282 TEST_HEADER include/spdk/nvme_spec.h 00:03:19.282 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:19.282 TEST_HEADER include/spdk/nvme_zns.h 00:03:19.282 CC app/nvmf_tgt/nvmf_main.o 00:03:19.282 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:19.282 TEST_HEADER include/spdk/nvmf.h 00:03:19.282 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:19.282 TEST_HEADER include/spdk/nvmf_spec.h 00:03:19.282 TEST_HEADER include/spdk/nvmf_transport.h 00:03:19.282 TEST_HEADER include/spdk/opal.h 00:03:19.282 TEST_HEADER include/spdk/pci_ids.h 00:03:19.282 TEST_HEADER include/spdk/queue.h 00:03:19.282 TEST_HEADER include/spdk/pipe.h 00:03:19.282 TEST_HEADER include/spdk/opal_spec.h 00:03:19.282 TEST_HEADER include/spdk/reduce.h 00:03:19.282 TEST_HEADER include/spdk/rpc.h 00:03:19.282 TEST_HEADER include/spdk/scheduler.h 00:03:19.282 TEST_HEADER include/spdk/scsi.h 00:03:19.282 TEST_HEADER include/spdk/scsi_spec.h 00:03:19.282 TEST_HEADER include/spdk/stdinc.h 00:03:19.282 TEST_HEADER include/spdk/sock.h 00:03:19.282 TEST_HEADER include/spdk/thread.h 00:03:19.282 TEST_HEADER include/spdk/trace.h 00:03:19.282 TEST_HEADER include/spdk/trace_parser.h 00:03:19.282 TEST_HEADER include/spdk/string.h 00:03:19.282 TEST_HEADER include/spdk/ublk.h 00:03:19.282 
TEST_HEADER include/spdk/tree.h 00:03:19.282 TEST_HEADER include/spdk/util.h 00:03:19.282 TEST_HEADER include/spdk/uuid.h 00:03:19.282 TEST_HEADER include/spdk/version.h 00:03:19.282 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:19.282 TEST_HEADER include/spdk/vhost.h 00:03:19.282 TEST_HEADER include/spdk/vmd.h 00:03:19.282 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:19.282 TEST_HEADER include/spdk/xor.h 00:03:19.282 CC app/spdk_tgt/spdk_tgt.o 00:03:19.282 TEST_HEADER include/spdk/zipf.h 00:03:19.282 CXX test/cpp_headers/accel.o 00:03:19.282 CXX test/cpp_headers/accel_module.o 00:03:19.551 CXX test/cpp_headers/assert.o 00:03:19.551 CXX test/cpp_headers/barrier.o 00:03:19.551 CXX test/cpp_headers/base64.o 00:03:19.551 CXX test/cpp_headers/bdev.o 00:03:19.551 CXX test/cpp_headers/bdev_module.o 00:03:19.551 CXX test/cpp_headers/bdev_zone.o 00:03:19.551 CXX test/cpp_headers/bit_array.o 00:03:19.551 CXX test/cpp_headers/bit_pool.o 00:03:19.551 CXX test/cpp_headers/blob_bdev.o 00:03:19.551 CXX test/cpp_headers/blobfs_bdev.o 00:03:19.551 CXX test/cpp_headers/blobfs.o 00:03:19.551 CXX test/cpp_headers/blob.o 00:03:19.551 CXX test/cpp_headers/conf.o 00:03:19.551 CXX test/cpp_headers/config.o 00:03:19.551 CXX test/cpp_headers/crc16.o 00:03:19.551 CXX test/cpp_headers/cpuset.o 00:03:19.551 CXX test/cpp_headers/crc32.o 00:03:19.551 CXX test/cpp_headers/crc64.o 00:03:19.551 CXX test/cpp_headers/dif.o 00:03:19.551 CXX test/cpp_headers/dma.o 00:03:19.551 CXX test/cpp_headers/endian.o 00:03:19.551 CXX test/cpp_headers/event.o 00:03:19.551 CXX test/cpp_headers/env_dpdk.o 00:03:19.551 CXX test/cpp_headers/env.o 00:03:19.551 CXX test/cpp_headers/fd_group.o 00:03:19.551 CXX test/cpp_headers/ftl.o 00:03:19.551 CXX test/cpp_headers/fd.o 00:03:19.551 CXX test/cpp_headers/file.o 00:03:19.551 CXX test/cpp_headers/gpt_spec.o 00:03:19.551 CXX test/cpp_headers/hexlify.o 00:03:19.551 CXX test/cpp_headers/histogram_data.o 00:03:19.551 CXX test/cpp_headers/idxd.o 00:03:19.551 CXX 
test/cpp_headers/idxd_spec.o 00:03:19.551 CXX test/cpp_headers/ioat.o 00:03:19.551 CXX test/cpp_headers/init.o 00:03:19.551 CXX test/cpp_headers/ioat_spec.o 00:03:19.551 CXX test/cpp_headers/iscsi_spec.o 00:03:19.551 CXX test/cpp_headers/json.o 00:03:19.551 CXX test/cpp_headers/jsonrpc.o 00:03:19.551 CXX test/cpp_headers/keyring.o 00:03:19.551 CC test/thread/poller_perf/poller_perf.o 00:03:19.551 CC examples/util/zipf/zipf.o 00:03:19.551 CC test/env/pci/pci_ut.o 00:03:19.551 CC examples/ioat/verify/verify.o 00:03:19.551 CC test/app/jsoncat/jsoncat.o 00:03:19.551 CC test/env/memory/memory_ut.o 00:03:19.551 CC examples/ioat/perf/perf.o 00:03:19.551 CC test/app/stub/stub.o 00:03:19.551 CC test/env/vtophys/vtophys.o 00:03:19.551 CC app/iscsi_tgt/iscsi_tgt.o 00:03:19.551 CC test/app/histogram_perf/histogram_perf.o 00:03:19.551 CXX test/cpp_headers/keyring_module.o 00:03:19.551 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:19.551 CC app/fio/nvme/fio_plugin.o 00:03:19.551 LINK spdk_lspci 00:03:19.551 CC app/fio/bdev/fio_plugin.o 00:03:19.551 CC test/dma/test_dma/test_dma.o 00:03:19.814 LINK rpc_client_test 00:03:19.814 CC test/app/bdev_svc/bdev_svc.o 00:03:19.814 LINK interrupt_tgt 00:03:19.814 CC test/env/mem_callbacks/mem_callbacks.o 00:03:19.814 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:19.814 LINK spdk_trace_record 00:03:19.814 LINK spdk_nvme_discover 00:03:19.814 LINK nvmf_tgt 00:03:20.073 LINK zipf 00:03:20.073 LINK jsoncat 00:03:20.073 LINK poller_perf 00:03:20.073 LINK env_dpdk_post_init 00:03:20.073 LINK stub 00:03:20.073 LINK histogram_perf 00:03:20.073 CXX test/cpp_headers/likely.o 00:03:20.073 CXX test/cpp_headers/log.o 00:03:20.073 LINK vtophys 00:03:20.073 CXX test/cpp_headers/lvol.o 00:03:20.073 CXX test/cpp_headers/memory.o 00:03:20.073 CXX test/cpp_headers/mmio.o 00:03:20.073 CXX test/cpp_headers/nbd.o 00:03:20.073 LINK spdk_tgt 00:03:20.073 CXX test/cpp_headers/net.o 00:03:20.073 CXX test/cpp_headers/notify.o 00:03:20.073 CXX 
test/cpp_headers/nvme_intel.o 00:03:20.073 CXX test/cpp_headers/nvme.o 00:03:20.073 CXX test/cpp_headers/nvme_ocssd.o 00:03:20.073 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:20.073 LINK ioat_perf 00:03:20.073 CXX test/cpp_headers/nvme_spec.o 00:03:20.073 CXX test/cpp_headers/nvme_zns.o 00:03:20.073 LINK verify 00:03:20.073 CXX test/cpp_headers/nvmf_cmd.o 00:03:20.073 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:20.073 CXX test/cpp_headers/nvmf.o 00:03:20.073 CXX test/cpp_headers/nvmf_spec.o 00:03:20.073 CXX test/cpp_headers/nvmf_transport.o 00:03:20.073 CXX test/cpp_headers/opal.o 00:03:20.073 CXX test/cpp_headers/opal_spec.o 00:03:20.073 LINK spdk_dd 00:03:20.073 CXX test/cpp_headers/pci_ids.o 00:03:20.073 CXX test/cpp_headers/pipe.o 00:03:20.073 CXX test/cpp_headers/queue.o 00:03:20.073 CXX test/cpp_headers/reduce.o 00:03:20.073 CXX test/cpp_headers/rpc.o 00:03:20.073 CXX test/cpp_headers/scheduler.o 00:03:20.073 CXX test/cpp_headers/scsi.o 00:03:20.073 CXX test/cpp_headers/scsi_spec.o 00:03:20.073 CXX test/cpp_headers/sock.o 00:03:20.073 CXX test/cpp_headers/stdinc.o 00:03:20.073 CXX test/cpp_headers/string.o 00:03:20.073 CXX test/cpp_headers/thread.o 00:03:20.073 CXX test/cpp_headers/trace.o 00:03:20.073 CXX test/cpp_headers/trace_parser.o 00:03:20.337 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:20.337 LINK iscsi_tgt 00:03:20.337 CXX test/cpp_headers/tree.o 00:03:20.337 CXX test/cpp_headers/ublk.o 00:03:20.337 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:20.337 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:20.337 CXX test/cpp_headers/util.o 00:03:20.337 CXX test/cpp_headers/uuid.o 00:03:20.337 CXX test/cpp_headers/version.o 00:03:20.337 CXX test/cpp_headers/vfio_user_pci.o 00:03:20.337 CXX test/cpp_headers/vfio_user_spec.o 00:03:20.337 CXX test/cpp_headers/vhost.o 00:03:20.337 CXX test/cpp_headers/vmd.o 00:03:20.337 CXX test/cpp_headers/xor.o 00:03:20.337 CXX test/cpp_headers/zipf.o 00:03:20.337 LINK spdk_trace 00:03:20.595 LINK bdev_svc 00:03:20.595 
LINK pci_ut 00:03:20.595 LINK test_dma 00:03:20.595 LINK nvme_fuzz 00:03:20.595 LINK spdk_bdev 00:03:20.595 CC test/event/event_perf/event_perf.o 00:03:20.595 CC test/event/reactor_perf/reactor_perf.o 00:03:20.595 LINK spdk_nvme 00:03:20.595 CC test/event/reactor/reactor.o 00:03:20.595 CC test/event/app_repeat/app_repeat.o 00:03:20.595 CC examples/vmd/lsvmd/lsvmd.o 00:03:20.595 CC examples/sock/hello_world/hello_sock.o 00:03:20.595 CC examples/vmd/led/led.o 00:03:20.595 CC examples/idxd/perf/perf.o 00:03:20.853 CC test/event/scheduler/scheduler.o 00:03:20.853 CC examples/thread/thread/thread_ex.o 00:03:20.853 LINK spdk_nvme_perf 00:03:20.853 LINK spdk_nvme_identify 00:03:20.853 LINK mem_callbacks 00:03:20.853 LINK reactor_perf 00:03:20.853 CC app/vhost/vhost.o 00:03:20.853 LINK event_perf 00:03:20.853 LINK spdk_top 00:03:20.853 LINK reactor 00:03:20.853 LINK lsvmd 00:03:20.853 LINK led 00:03:20.853 LINK app_repeat 00:03:20.853 LINK vhost_fuzz 00:03:21.112 LINK hello_sock 00:03:21.112 LINK scheduler 00:03:21.112 LINK thread 00:03:21.112 LINK idxd_perf 00:03:21.112 LINK vhost 00:03:21.112 CC test/nvme/fdp/fdp.o 00:03:21.112 CC test/nvme/overhead/overhead.o 00:03:21.112 CC test/nvme/reset/reset.o 00:03:21.112 CC test/nvme/aer/aer.o 00:03:21.112 CC test/nvme/startup/startup.o 00:03:21.112 CC test/nvme/connect_stress/connect_stress.o 00:03:21.112 CC test/nvme/compliance/nvme_compliance.o 00:03:21.112 CC test/nvme/cuse/cuse.o 00:03:21.112 CC test/nvme/boot_partition/boot_partition.o 00:03:21.112 CC test/nvme/simple_copy/simple_copy.o 00:03:21.112 CC test/nvme/sgl/sgl.o 00:03:21.112 CC test/nvme/err_injection/err_injection.o 00:03:21.112 CC test/nvme/reserve/reserve.o 00:03:21.112 CC test/nvme/e2edp/nvme_dp.o 00:03:21.112 CC test/nvme/fused_ordering/fused_ordering.o 00:03:21.112 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:21.112 CC test/accel/dif/dif.o 00:03:21.112 CC test/blobfs/mkfs/mkfs.o 00:03:21.370 LINK memory_ut 00:03:21.370 CC test/lvol/esnap/esnap.o 
00:03:21.370 LINK startup 00:03:21.370 LINK boot_partition 00:03:21.370 LINK doorbell_aers 00:03:21.370 LINK fused_ordering 00:03:21.370 LINK reserve 00:03:21.370 LINK nvme_dp 00:03:21.370 LINK err_injection 00:03:21.370 LINK simple_copy 00:03:21.370 LINK mkfs 00:03:21.370 LINK nvme_compliance 00:03:21.370 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:21.370 LINK reset 00:03:21.370 CC examples/nvme/abort/abort.o 00:03:21.370 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:21.370 CC examples/nvme/reconnect/reconnect.o 00:03:21.628 LINK sgl 00:03:21.628 CC examples/nvme/hotplug/hotplug.o 00:03:21.628 CC examples/nvme/arbitration/arbitration.o 00:03:21.628 CC examples/nvme/hello_world/hello_world.o 00:03:21.628 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:21.628 LINK overhead 00:03:21.628 LINK fdp 00:03:21.628 LINK connect_stress 00:03:21.628 CC examples/accel/perf/accel_perf.o 00:03:21.628 CC examples/blob/cli/blobcli.o 00:03:21.628 CC examples/blob/hello_world/hello_blob.o 00:03:21.628 LINK dif 00:03:21.628 LINK pmr_persistence 00:03:21.628 LINK cmb_copy 00:03:21.894 LINK hello_world 00:03:21.894 LINK hotplug 00:03:21.894 LINK aer 00:03:21.894 LINK arbitration 00:03:21.894 LINK reconnect 00:03:21.894 LINK abort 00:03:21.894 LINK hello_blob 00:03:22.154 LINK accel_perf 00:03:22.154 LINK nvme_manage 00:03:22.154 LINK iscsi_fuzz 00:03:22.411 LINK blobcli 00:03:22.411 CC test/bdev/bdevio/bdevio.o 00:03:22.670 LINK cuse 00:03:22.670 CC examples/bdev/hello_world/hello_bdev.o 00:03:22.670 CC examples/bdev/bdevperf/bdevperf.o 00:03:22.928 LINK bdevio 00:03:22.928 LINK hello_bdev 00:03:23.494 LINK bdevperf 00:03:24.429 CC examples/nvmf/nvmf/nvmf.o 00:03:24.689 LINK nvmf 00:03:27.227 LINK esnap 00:03:27.487 00:03:27.487 real 1m35.348s 00:03:27.487 user 18m10.605s 00:03:27.487 sys 4m27.124s 00:03:27.487 19:39:18 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:03:27.487 19:39:18 make -- common/autotest_common.sh@10 -- $ set +x 00:03:27.487 
************************************ 00:03:27.487 END TEST make 00:03:27.487 ************************************ 00:03:27.487 19:39:18 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:27.487 19:39:18 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:27.487 19:39:18 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:27.487 19:39:18 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.487 19:39:18 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:27.487 19:39:18 -- pm/common@44 -- $ pid=1213858 00:03:27.487 19:39:18 -- pm/common@50 -- $ kill -TERM 1213858 00:03:27.487 19:39:18 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.487 19:39:18 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:27.487 19:39:18 -- pm/common@44 -- $ pid=1213860 00:03:27.487 19:39:18 -- pm/common@50 -- $ kill -TERM 1213860 00:03:27.487 19:39:18 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.487 19:39:18 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:27.487 19:39:18 -- pm/common@44 -- $ pid=1213862 00:03:27.487 19:39:18 -- pm/common@50 -- $ kill -TERM 1213862 00:03:27.487 19:39:18 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.487 19:39:18 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:27.487 19:39:18 -- pm/common@44 -- $ pid=1213886 00:03:27.487 19:39:18 -- pm/common@50 -- $ sudo -E kill -TERM 1213886 00:03:27.487 19:39:19 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:27.487 19:39:19 -- nvmf/common.sh@7 -- # uname -s 00:03:27.487 19:39:19 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:27.487 19:39:19 -- nvmf/common.sh@9 -- # 
NVMF_PORT=4420 00:03:27.487 19:39:19 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:27.487 19:39:19 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:27.487 19:39:19 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:27.487 19:39:19 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:27.487 19:39:19 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:27.487 19:39:19 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:27.487 19:39:19 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:27.487 19:39:19 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:27.487 19:39:19 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:03:27.487 19:39:19 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:03:27.487 19:39:19 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:27.487 19:39:19 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:27.487 19:39:19 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:27.487 19:39:19 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:27.487 19:39:19 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:27.745 19:39:19 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:27.745 19:39:19 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:27.745 19:39:19 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:27.745 19:39:19 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.745 19:39:19 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.745 19:39:19 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.745 19:39:19 -- paths/export.sh@5 -- # export PATH 00:03:27.745 19:39:19 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.745 19:39:19 -- nvmf/common.sh@47 -- # : 0 00:03:27.745 19:39:19 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:27.745 19:39:19 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:27.746 19:39:19 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:27.746 19:39:19 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:27.746 19:39:19 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:27.746 19:39:19 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:27.746 19:39:19 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:27.746 19:39:19 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:27.746 19:39:19 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:27.746 19:39:19 -- spdk/autotest.sh@32 -- # uname -s 00:03:27.746 19:39:19 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:27.746 19:39:19 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:27.746 19:39:19 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:27.746 19:39:19 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:27.746 19:39:19 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:27.746 19:39:19 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:27.746 19:39:19 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:27.746 19:39:19 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:27.746 19:39:19 -- spdk/autotest.sh@48 -- # udevadm_pid=1281446 00:03:27.746 19:39:19 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:27.746 19:39:19 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:27.746 19:39:19 -- pm/common@17 -- # local monitor 00:03:27.746 19:39:19 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.746 19:39:19 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.746 19:39:19 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.746 19:39:19 -- pm/common@21 -- # date +%s 00:03:27.746 19:39:19 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:27.746 19:39:19 -- pm/common@21 -- # date +%s 00:03:27.746 19:39:19 -- pm/common@25 -- # sleep 1 00:03:27.746 19:39:19 -- pm/common@21 -- # date +%s 00:03:27.746 19:39:19 -- pm/common@21 -- # date +%s 00:03:27.746 19:39:19 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721842759 00:03:27.746 19:39:19 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721842759 00:03:27.746 19:39:19 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721842759 00:03:27.746 19:39:19 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721842759 00:03:27.746 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721842759_collect-vmstat.pm.log 00:03:27.746 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721842759_collect-cpu-temp.pm.log 00:03:27.746 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721842759_collect-cpu-load.pm.log 00:03:27.746 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721842759_collect-bmc-pm.bmc.pm.log 00:03:28.683 19:39:20 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:28.683 19:39:20 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:28.683 19:39:20 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:28.683 19:39:20 -- common/autotest_common.sh@10 -- # set +x 00:03:28.683 19:39:20 -- spdk/autotest.sh@59 -- # create_test_list 00:03:28.683 19:39:20 -- common/autotest_common.sh@748 -- # xtrace_disable 00:03:28.683 19:39:20 -- common/autotest_common.sh@10 -- # set +x 00:03:28.683 19:39:20 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:28.683 19:39:20 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:28.683 19:39:20 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:28.683 19:39:20 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:28.683 19:39:20 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:03:28.683 19:39:20 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:28.683 19:39:20 -- common/autotest_common.sh@1455 -- # uname 00:03:28.683 19:39:20 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:28.683 19:39:20 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:28.683 19:39:20 -- common/autotest_common.sh@1475 -- # uname 00:03:28.683 19:39:20 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:28.683 19:39:20 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:28.683 19:39:20 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:28.683 19:39:20 -- spdk/autotest.sh@72 -- # hash lcov 00:03:28.683 19:39:20 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:28.683 19:39:20 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:28.683 --rc lcov_branch_coverage=1 00:03:28.683 --rc lcov_function_coverage=1 00:03:28.683 --rc genhtml_branch_coverage=1 00:03:28.683 --rc genhtml_function_coverage=1 00:03:28.683 --rc genhtml_legend=1 00:03:28.683 --rc geninfo_all_blocks=1 00:03:28.683 ' 00:03:28.683 19:39:20 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:28.683 --rc lcov_branch_coverage=1 00:03:28.683 --rc lcov_function_coverage=1 00:03:28.683 --rc genhtml_branch_coverage=1 00:03:28.683 --rc genhtml_function_coverage=1 00:03:28.683 --rc genhtml_legend=1 00:03:28.683 --rc geninfo_all_blocks=1 00:03:28.683 ' 00:03:28.683 19:39:20 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:28.683 --rc lcov_branch_coverage=1 00:03:28.683 --rc lcov_function_coverage=1 00:03:28.683 --rc genhtml_branch_coverage=1 00:03:28.683 --rc genhtml_function_coverage=1 00:03:28.683 --rc genhtml_legend=1 00:03:28.683 --rc geninfo_all_blocks=1 00:03:28.683 --no-external' 00:03:28.683 19:39:20 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:03:28.683 --rc lcov_branch_coverage=1 00:03:28.683 --rc lcov_function_coverage=1 00:03:28.683 --rc genhtml_branch_coverage=1 00:03:28.683 --rc genhtml_function_coverage=1 00:03:28.683 --rc 
genhtml_legend=1 00:03:28.683 --rc geninfo_all_blocks=1 00:03:28.683 --no-external' 00:03:28.683 19:39:20 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:28.943 lcov: LCOV version 1.14 00:03:28.943 19:39:20 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:47.041 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:47.041 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:59.301 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:59.301 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:59.301 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:59.301 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:59.302 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no 
functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 
00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:59.302 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any 
data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:59.302 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:59.302 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce 
any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:59.303 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:59.303 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:04:04.577 19:39:55 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:04:04.577 19:39:55 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:04.577 19:39:55 -- common/autotest_common.sh@10 -- # set +x 00:04:04.577 19:39:55 -- spdk/autotest.sh@91 -- # rm -f 00:04:04.577 19:39:55 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 
reset 00:04:07.877 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:07.877 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:07.877 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:04:07.877 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:07.877 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:07.877 19:39:59 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:04:07.877 19:39:59 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:07.877 19:39:59 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:07.877 19:39:59 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:07.877 19:39:59 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:07.877 19:39:59 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:07.877 19:39:59 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:07.877 19:39:59 -- 
common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:07.877 19:39:59 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:07.877 19:39:59 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:04:07.877 19:39:59 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:04:07.877 19:39:59 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:04:07.877 19:39:59 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:04:07.877 19:39:59 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:04:07.877 19:39:59 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:07.877 No valid GPT data, bailing 00:04:07.877 19:39:59 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:07.877 19:39:59 -- scripts/common.sh@391 -- # pt= 00:04:07.877 19:39:59 -- scripts/common.sh@392 -- # return 1 00:04:07.877 19:39:59 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:07.877 1+0 records in 00:04:07.877 1+0 records out 00:04:07.877 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00353687 s, 296 MB/s 00:04:07.877 19:39:59 -- spdk/autotest.sh@118 -- # sync 00:04:07.877 19:39:59 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:07.877 19:39:59 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:07.877 19:39:59 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:13.154 19:40:04 -- spdk/autotest.sh@124 -- # uname -s 00:04:13.154 19:40:04 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:04:13.154 19:40:04 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:13.154 19:40:04 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:13.154 19:40:04 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:13.154 19:40:04 -- common/autotest_common.sh@10 -- # set +x 00:04:13.154 
************************************ 00:04:13.154 START TEST setup.sh 00:04:13.154 ************************************ 00:04:13.154 19:40:04 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:13.154 * Looking for test storage... 00:04:13.154 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:13.154 19:40:04 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:13.154 19:40:04 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:13.154 19:40:04 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:13.154 19:40:04 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:13.154 19:40:04 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:13.154 19:40:04 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:13.154 ************************************ 00:04:13.154 START TEST acl 00:04:13.154 ************************************ 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:13.154 * Looking for test storage... 
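The `get_zoned_devs` pass traced above (both here and in the `autotest.sh` run earlier) decides whether each NVMe namespace is zoned by reading its sysfs `queue/zoned` attribute. A minimal sketch of that logic, reconstructed from the trace rather than copied from SPDK's `autotest_common.sh`:

```shell
# Sketch of the zoned-device scan visible in the xtrace: a block device
# is zoned when /sys/block/<dev>/queue/zoned reads something other than
# "none". Function names mirror the trace; bodies are a reconstruction.
is_block_zoned() {
    local device=$1
    # Missing attribute (old kernel, or no such device) counts as not zoned
    [[ -e /sys/block/$device/queue/zoned ]] || return 1
    [[ $(</sys/block/$device/queue/zoned) != none ]]
}

declare -A zoned_devs=()
for nvme in /sys/block/nvme*; do
    [[ -e $nvme ]] || continue        # glob may match nothing
    dev=${nvme##*/}
    if is_block_zoned "$dev"; then
        # Later stages skip zoned namespaces (e.g. the dd wipe above)
        zoned_devs[$dev]=1
    fi
done
```

This matches the trace's behavior on this machine: `nvme0n1` reports `none`, so `(( 0 > 0 ))` is false and the GPT check plus `dd` wipe proceed on it.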
00:04:13.154 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:13.154 19:40:04 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:13.154 19:40:04 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:13.154 19:40:04 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:13.154 19:40:04 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:13.154 19:40:04 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:13.154 19:40:04 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:13.154 19:40:04 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:13.154 19:40:04 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:13.154 19:40:04 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:17.349 19:40:08 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:17.349 19:40:08 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:17.349 19:40:08 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:17.349 19:40:08 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:17.349 19:40:08 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:17.349 19:40:08 setup.sh.acl -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.641 Hugepages 00:04:20.641 node hugesize free / total 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.641 00:04:20.641 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:20.641 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 
19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- 
# [[ 0000:00:04.7 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.901 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 
00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:20.902 19:40:12 setup.sh.acl -- 
setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:20.902 19:40:12 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:20.902 19:40:12 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:20.902 19:40:12 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:20.902 19:40:12 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:21.161 ************************************ 00:04:21.161 START TEST denied 00:04:21.161 ************************************ 00:04:21.161 19:40:12 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # denied 00:04:21.161 19:40:12 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:21.161 19:40:12 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:21.161 19:40:12 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:21.161 19:40:12 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.161 19:40:12 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:25.357 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:25.357 19:40:16 setup.sh.acl.denied 
-- setup/acl.sh@41 -- # setup reset 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:25.357 19:40:16 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:30.638 00:04:30.638 real 0m9.095s 00:04:30.638 user 0m2.997s 00:04:30.638 sys 0m5.408s 00:04:30.638 19:40:21 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:30.638 19:40:21 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:30.638 ************************************ 00:04:30.638 END TEST denied 00:04:30.638 ************************************ 00:04:30.638 19:40:21 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:30.638 19:40:21 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:30.638 19:40:21 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:30.638 19:40:21 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:30.638 ************************************ 00:04:30.638 START TEST allowed 00:04:30.638 ************************************ 00:04:30.638 19:40:21 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:04:30.638 19:40:21 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:30.638 19:40:21 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:30.638 19:40:21 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:30.638 19:40:21 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:30.638 19:40:21 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:37.347 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:37.347 19:40:27 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:37.347 19:40:27 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:37.347 19:40:27 
setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:37.347 19:40:27 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:37.347 19:40:27 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:40.642 00:04:40.642 real 0m10.247s 00:04:40.642 user 0m2.735s 00:04:40.642 sys 0m5.268s 00:04:40.642 19:40:31 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:40.642 19:40:31 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:40.642 ************************************ 00:04:40.642 END TEST allowed 00:04:40.642 ************************************ 00:04:40.642 00:04:40.642 real 0m27.505s 00:04:40.642 user 0m8.656s 00:04:40.642 sys 0m16.218s 00:04:40.642 19:40:31 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:40.642 19:40:31 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:40.642 ************************************ 00:04:40.642 END TEST acl 00:04:40.642 ************************************ 00:04:40.642 19:40:32 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:40.642 19:40:32 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:40.642 19:40:32 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:40.642 19:40:32 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:40.642 ************************************ 00:04:40.642 START TEST hugepages 00:04:40.642 ************************************ 00:04:40.642 19:40:32 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:40.642 * Looking for test storage... 
00:04:40.642 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.642 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 76980724 kB' 'MemAvailable: 80255468 kB' 'Buffers: 11136 kB' 'Cached: 9299256 kB' 'SwapCached: 0 kB' 'Active: 6345000 kB' 'Inactive: 3442544 kB' 'Active(anon): 5954208 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 480428 kB' 'Mapped: 180064 kB' 'Shmem: 5477056 kB' 'KReclaimable: 190100 kB' 'Slab: 507564 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 317464 kB' 'KernelStack: 16144 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438196 kB' 'Committed_AS: 7335104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 
19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.643 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 
19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce 
== \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.644 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@18 -- # 
global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
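The long run of `[[ <key> == \H\u\g\e\p\a\g\e\s\i\z\e ]] ... continue` entries above is the trace of a loop scanning a meminfo-style stream one line at a time until it reaches the requested key. This is a minimal sketch of that pattern, not the exact SPDK helper: `get_meminfo_field` is a hypothetical name, and it reads from stdin instead of `/proc/meminfo` so the example is reproducible.

```shell
#!/usr/bin/env bash
# Hedged sketch of the get_meminfo-style loop traced above: split each line
# on ': ', skip keys that do not match (the "continue" entries in the log),
# and print the value of the requested key.
get_meminfo_field() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] || continue   # non-matching key: next line
    echo "$val"                        # matching key: emit value, stop
    return 0
  done
  return 1
}

# Example against a canned snippet rather than the live /proc/meminfo:
printf '%s\n' 'MemTotal: 92293492 kB' 'Hugepagesize: 2048 kB' |
  get_meminfo_field Hugepagesize   # prints 2048
```

In the trace this is why `Hugepagesize` finally hits `echo 2048` and `return 0` after every other key produced only `continue`: the loop keys `IFS=': '` so the unit (`kB`) lands in the throwaway `_` field.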
00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:40.645 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:40.905 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:40.905 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:40.905 19:40:32 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:40.905 19:40:32 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:40.905 19:40:32 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:40.905 19:40:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:40.905 ************************************ 00:04:40.905 START TEST default_setup 00:04:40.905 ************************************ 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # default_setup 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:40.905 19:40:32 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:40.905 19:40:32 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:45.103 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:45.103 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:45.103 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:00:04.4 (8086 2021): ioatdma 
-> vfio-pci 00:04:45.103 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:45.103 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:47.646 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local 
var val 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79106076 kB' 'MemAvailable: 82380820 kB' 'Buffers: 11136 kB' 'Cached: 9299376 kB' 'SwapCached: 0 kB' 'Active: 6364348 kB' 'Inactive: 3442544 kB' 'Active(anon): 5973556 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 499760 kB' 'Mapped: 180292 kB' 'Shmem: 5477176 kB' 'KReclaimable: 190100 kB' 'Slab: 506852 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316752 kB' 'KernelStack: 16464 kB' 'PageTables: 8740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7356488 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.646 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.647 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # 
mapfile -t mem 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79108556 kB' 'MemAvailable: 82383300 kB' 'Buffers: 11136 kB' 'Cached: 9299380 kB' 'SwapCached: 0 kB' 'Active: 6364288 kB' 'Inactive: 3442544 kB' 'Active(anon): 5973496 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 499656 kB' 'Mapped: 180240 kB' 'Shmem: 5477180 kB' 'KReclaimable: 190100 kB' 'Slab: 506648 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316548 kB' 'KernelStack: 16608 kB' 'PageTables: 9036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7357748 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.648 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 
19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.649 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 
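[Editor's note] The long runs of `IFS=': '` / `read -r var val _` / `continue` lines above are the xtrace of setup/common.sh's `get_meminfo` helper scanning /proc/meminfo for a single key (here `anon`, `surp`, and `rsvd` for the hugepages checks). A minimal standalone sketch of that lookup, simplified from what the trace shows (per-NUMA-node handling and the `Node N ` prefix stripping are omitted; the sample file path is illustrative, not from the log):

```shell
#!/usr/bin/env bash
# Sketch of a get_meminfo-style lookup, modeled on the traced helper:
# scan a meminfo-format file and print the value for one key.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"   # value in kB, or a bare count for HugePages_* keys
            return 0
        fi
    done < "$mem_f"
    return 1              # key not present
}

# Example against a canned snippet instead of the live /proc/meminfo:
printf '%s\n' 'MemTotal: 92293492 kB' 'HugePages_Surp: 0' > /tmp/meminfo.sample
get_meminfo HugePages_Surp /tmp/meminfo.sample   # prints: 0
```

With `IFS=': '` the `read` splits each line on colons and spaces, so `var` receives the key and `val` the number, which is why the trace compares every key name against the quoted target (e.g. `\H\u\g\e\P\a\g\e\s\_\S\u\r\p`) and `continue`s until it matches, then echoes the value and returns.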
00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79108712 kB' 'MemAvailable: 82383456 kB' 'Buffers: 11136 kB' 'Cached: 9299396 kB' 'SwapCached: 0 kB' 'Active: 6363848 kB' 'Inactive: 3442544 kB' 'Active(anon): 5973056 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 499148 kB' 'Mapped: 180240 kB' 'Shmem: 5477196 kB' 'KReclaimable: 190100 kB' 'Slab: 506240 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316140 kB' 'KernelStack: 16480 kB' 'PageTables: 8900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7358016 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:04:47.650 19:40:38 setup.sh.hugepages.default_setup -- [setup/common.sh@31-32 trace elided: per-field scan of /proc/meminfo for HugePages_Rsvd (Active(file) … HugePages_Free), repetitive IFS=': ' / read -r / continue iterations] 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:47.652 nr_hugepages=1024 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:47.652 resv_hugepages=0 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:47.652 surplus_hugepages=0 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:47.652 anon_hugepages=0 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local
get=HugePages_Total 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79108592 kB' 'MemAvailable: 82383336 kB' 'Buffers: 11136 kB' 'Cached: 9299400 kB' 'SwapCached: 0 kB' 'Active: 6363020 kB' 'Inactive: 3442544 kB' 'Active(anon): 5972228 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 498264 kB' 'Mapped: 180240 kB' 'Shmem: 5477200 kB' 'KReclaimable: 190100 kB' 'Slab: 506240 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316140 kB' 'KernelStack: 16320 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7356552 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:47.652 19:40:38 setup.sh.hugepages.default_setup -- [setup/common.sh@31-32 trace elided: per-field scan of /proc/meminfo for HugePages_Total (MemTotal … CmaFree), repetitive IFS=': ' / read -r / continue iterations]
-- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv )) 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 43398680 kB' 'MemUsed: 4718260 kB' 'SwapCached: 0 kB' 'Active: 1829504 kB' 'Inactive: 89344 kB' 'Active(anon): 1631712 kB' 'Inactive(anon): 0 kB' 'Active(file): 197792 kB' 'Inactive(file): 89344 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1529572 kB' 'Mapped: 149668 kB' 'AnonPages: 392376 kB' 'Shmem: 1242436 kB' 'KernelStack: 10504 kB' 'PageTables: 6804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 103580 kB' 'Slab: 286788 kB' 'SReclaimable: 103580 kB' 'SUnreclaim: 
183208 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 
19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.654 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:47.655 19:40:38 
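The long run of `IFS=': '` / `read -r var val _` / `[[ $var == ... ]]` / `continue` lines above is the traced body of `setup/common.sh`'s `get_meminfo`: it scans meminfo-style `Key: value` records until the requested key matches, then echoes the value (here `HugePages_Surp: 0` for node 0). A minimal standalone sketch of that pattern, fed a canned record instead of a live `/proc/meminfo` so it runs anywhere, is below; this is a hypothetical rewrite for illustration, not the SPDK helper itself.

```shell
# Sketch of the get_meminfo scan pattern seen in the trace above.
# Reads "Key: value [unit]" lines on stdin; prints the value for the
# requested key. IFS=': ' splits on both the colon and the space, so
# a trailing "kB" unit lands in the throwaway `_` field.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

# Canned meminfo record (values taken from the trace) instead of
# /sys/devices/system/node/node0/meminfo:
printf '%s\n' 'MemTotal: 48116940 kB' \
              'HugePages_Total: 1024' \
              'HugePages_Surp: 0' | get_meminfo HugePages_Surp
```

The real helper additionally strips the `Node 0 ` prefix from per-node meminfo files (the `mem=("${mem[@]#Node +([0-9]) }")` line in the trace) before scanning.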
setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:47.655 node0=1024 expecting 1024 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:47.655 00:04:47.655 real 0m6.652s 00:04:47.655 user 0m1.591s 00:04:47.655 sys 0m2.660s 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:47.655 19:40:38 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:47.655 ************************************ 00:04:47.655 END TEST default_setup 00:04:47.655 ************************************ 00:04:47.655 19:40:38 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:47.655 19:40:38 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:47.655 19:40:38 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:47.655 19:40:38 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:47.655 ************************************ 00:04:47.656 START TEST per_node_1G_alloc 00:04:47.656 ************************************ 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # per_node_1G_alloc 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:47.656 19:40:39 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:47.656 19:40:39 
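The `get_test_nr_hugepages_per_node` trace just above assigns 512 pages (512 x 2 MB = 1 GiB, matching the test name) to each of the two requested NUMA nodes, then exports `NRHUGE=512` and `HUGENODE=0,1` for `setup.sh`. A self-contained sketch of that split, with node IDs and counts taken from the trace rather than read from a live system, is:

```shell
# Sketch of the per-node hugepage split from hugepages.sh's
# get_test_nr_hugepages_per_node, as traced above. Illustrative
# standalone rewrite; node IDs and page counts are from the log.
user_nodes=(0 1)          # HUGENODE=0,1 in the trace
nr_hugepages=512          # 512 x 2 MB pages = 1 GiB per node
nodes_test=()

# Give every requested node the same per-node page count.
for node in "${user_nodes[@]}"; do
    nodes_test[node]=$nr_hugepages
done

# Sum what was assigned; verify_nr_hugepages later checks this
# against the kernel's HugePages_Total.
total=0
for node in "${!nodes_test[@]}"; do
    (( total += nodes_test[node] ))
done
echo "total=$total"
```

On a real system the per-node counts end up in `/sys/devices/system/node/node<N>/hugepages/hugepages-2048kB/nr_hugepages`, which is why the later `HugePages_Surp` checks in the trace read each node's own meminfo file.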
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.656 19:40:39 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:50.949 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:50.949 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:50.949 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:50.949 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:50.949 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:50.949 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:50.949 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:51.212 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:51.212 
19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79100632 kB' 'MemAvailable: 82375376 kB' 'Buffers: 11136 kB' 'Cached: 9299512 kB' 'SwapCached: 0 kB' 'Active: 6358164 kB' 'Inactive: 3442544 kB' 'Active(anon): 5967372 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492744 kB' 'Mapped: 179208 kB' 'Shmem: 5477312 kB' 'KReclaimable: 190100 kB' 'Slab: 506120 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316020 kB' 'KernelStack: 16128 kB' 'PageTables: 7816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7344320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@32 -- # continue
00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:51.212 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[identical IFS=': ' / read -r var val _ / [[ $var == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue records repeated for every remaining /proc/meminfo key from Buffers through HardwareCorrupted]
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79101196 kB' 'MemAvailable: 82375940 kB' 'Buffers: 11136 kB' 'Cached: 9299516 kB' 'SwapCached: 0 kB' 'Active: 6357456 kB' 'Inactive: 3442544 kB' 'Active(anon): 5966664 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492520 kB' 'Mapped: 179112 kB' 'Shmem: 5477316 kB' 'KReclaimable: 190100 kB' 'Slab: 506076 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315976 kB' 'KernelStack: 16144 kB' 'PageTables: 7864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7345832 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB'
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:51.214 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[identical IFS=': ' / read -r var val _ / [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue records repeated for every /proc/meminfo key from MemFree through ShmemHugePages, none matching HugePages_Surp]
00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc --
setup/common.sh@31 -- # IFS=': ' 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.215 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@99 -- # surp=0 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79101972 kB' 'MemAvailable: 82376716 kB' 'Buffers: 11136 kB' 'Cached: 9299532 kB' 'SwapCached: 0 kB' 'Active: 6357084 kB' 'Inactive: 3442544 kB' 'Active(anon): 5966292 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492104 kB' 'Mapped: 179112 kB' 'Shmem: 5477332 kB' 'KReclaimable: 190100 kB' 'Slab: 506076 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315976 kB' 'KernelStack: 16080 kB' 'PageTables: 7656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 
'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7345480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.216 19:40:42 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.216 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ [... identical xtrace iterations for the remaining /proc/meminfo keys elided; no key matches until HugePages_Rsvd ...] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc --
setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:51.482 nr_hugepages=1024 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:51.482 resv_hugepages=0 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:51.482 surplus_hugepages=0 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:51.482 anon_hugepages=0 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local 
mem_f mem 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79103000 kB' 'MemAvailable: 82377744 kB' 'Buffers: 11136 kB' 'Cached: 9299552 kB' 'SwapCached: 0 kB' 'Active: 6357576 kB' 'Inactive: 3442544 kB' 'Active(anon): 5966784 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492608 kB' 'Mapped: 179112 kB' 'Shmem: 5477352 kB' 'KReclaimable: 190100 kB' 'Slab: 506076 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315976 kB' 'KernelStack: 16224 kB' 'PageTables: 7940 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7346992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 
'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
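The per-field trace above is `setup/common.sh`'s `get_meminfo` loop: it prints every `Key: value` line from the meminfo source, then re-reads them with `IFS=': ' read -r var val _`, issuing `continue` for every key until the requested one matches, at which point the value is echoed and the function returns. A minimal sketch of that scan pattern (the function name `get_field` and the sample text are mine, not from the script):

```shell
# Sketch of the common.sh scan pattern: split each "Key: value" line on
# ": ", skip non-matching keys, print the value of the requested key.
get_field() {
    local want=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$want" ]] || continue   # common.sh@32 in the trace
        echo "$val"                         # common.sh@33
        return 0
    done
}

sample=$'MemTotal: 92293492 kB\nHugePages_Total: 1024\nHugePages_Rsvd: 0'
get_field HugePages_Total <<< "$sample"   # prints 1024
```

The real script collects the lines with `mapfile -t mem` first and feeds them back through `printf '%s\n'`, which is why the trace shows the whole meminfo dump before the per-key comparisons begin.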
00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.482 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 
19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.483 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 44442080 kB' 'MemUsed: 3674860 kB' 
'SwapCached: 0 kB' 'Active: 1825012 kB' 'Inactive: 89344 kB' 'Active(anon): 1627220 kB' 'Inactive(anon): 0 kB' 'Active(file): 197792 kB' 'Inactive(file): 89344 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1529572 kB' 'Mapped: 149324 kB' 'AnonPages: 387872 kB' 'Shmem: 1242436 kB' 'KernelStack: 10424 kB' 'PageTables: 6248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 103580 kB' 'Slab: 286580 kB' 'SReclaimable: 103580 kB' 'SUnreclaim: 183000 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.484 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
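The surrounding bookkeeping (hugepages.sh@107-116 in the trace) checks that the 1024 allocated pages equal `nr_hugepages + surp + resv`, then distributes the expected count across the two NUMA nodes at 512 pages each before verifying node 0. A sketch reconstructing that arithmetic with the values logged above (variable names from the trace; the layout is illustrative):

```shell
# Values as logged: 1024 global hugepages, 0 surplus, 0 reserved,
# two nodes expected to hold 512 pages each.
nr_hugepages=1024 surp=0 resv=0
declare -a nodes_test=([0]=512 [1]=512)

# Global consistency check, as in hugepages.sh@107/@110.
(( 1024 == nr_hugepages + surp + resv )) && echo "global total OK"

# Fold the reserved pages into each node's expectation (hugepages.sh@116)
# and sum the per-node expectations.
total=0
for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    (( total += nodes_test[node] ))
done
echo "per-node sum: $total"   # prints 1024
```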
00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:51.485 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.485 19:40:42 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:51.486 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:51.486 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:51.486 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176552 kB' 'MemFree: 34659940 kB' 'MemUsed: 9516612 kB' 'SwapCached: 0 kB' 'Active: 4532520 kB' 'Inactive: 3353200 kB' 'Active(anon): 4339520 kB' 'Inactive(anon): 0 kB' 'Active(file): 193000 kB' 'Inactive(file): 3353200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7781164 kB' 'Mapped: 29788 kB' 'AnonPages: 104600 kB' 'Shmem: 4234964 kB' 'KernelStack: 5944 kB' 'PageTables: 1984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 86520 kB' 'Slab: 219496 kB' 'SReclaimable: 86520 kB' 'SUnreclaim: 132976 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace trimmed: setup/common.sh@32 loop matching each node1 meminfo field (MemTotal ... HugePages_Free) against HugePages_Surp, one "continue" per field ...]
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:51.487
00:04:51.487 real 0m3.916s
00:04:51.487 user 0m1.551s
00:04:51.487 sys 0m2.468s
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:04:51.487 19:40:42 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:51.487 ************************************
00:04:51.487 END TEST per_node_1G_alloc
00:04:51.487 ************************************
00:04:51.487 19:40:42 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:51.487 19:40:42 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:51.487 19:40:42 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:51.487 19:40:42 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:51.487 ************************************
00:04:51.487 START TEST even_2G_alloc
00:04:51.487 ************************************
00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc
00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
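The xtrace above shows setup/common.sh's get_meminfo reading /sys/devices/system/node/node1/meminfo field by field with IFS=': ' until it reaches the requested key (HugePages_Surp), then echoing its value. The following is a minimal standalone sketch of that pattern; the helper name `get_meminfo_field`, the echo-0 fallback for a missing key, and the sample file contents are illustrative assumptions, not SPDK code:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern traced above: split each line of a
# meminfo-style file on ':' and ' ', compare the field name against the
# requested key, and print its value when found.
get_meminfo_field() {
    local get=$1 mem_f=$2 var val rest
    while IFS=': ' read -r var val rest; do
        # Per-node files prefix every line with "Node <n> "; re-split
        # the remainder so var/val land on the real field and value.
        if [[ $var == Node ]]; then
            IFS=': ' read -r var val rest <<<"$rest"
        fi
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    echo 0   # illustrative fallback when the key is absent
}

# Demo against a fabricated node-style meminfo snippet (not data from
# this run):
sample=$(mktemp)
printf '%s\n' \
    'Node 1 HugePages_Total: 512' \
    'Node 1 HugePages_Free: 512' \
    'Node 1 HugePages_Surp: 0' > "$sample"
surp=$(get_meminfo_field HugePages_Surp "$sample")
echo "HugePages_Surp=$surp"
rm -f "$sample"
```

The real helper instead strips the `Node <n>` prefix up front with `mem=("${mem[@]#Node +([0-9]) }")` under extglob, as visible in the trace; the inner re-split above plays the same role.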
00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.487 19:40:43 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:55.686 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:55.686 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:55.686 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:55.686 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:55.686 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@89 -- # local node 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.686 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
92293492 kB' 'MemFree: 79114208 kB' 'MemAvailable: 82388952 kB' 'Buffers: 11136 kB' 'Cached: 9299672 kB' 'SwapCached: 0 kB' 'Active: 6358188 kB' 'Inactive: 3442544 kB' 'Active(anon): 5967396 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493264 kB' 'Mapped: 179236 kB' 'Shmem: 5477472 kB' 'KReclaimable: 190100 kB' 'Slab: 506136 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316036 kB' 'KernelStack: 16160 kB' 'PageTables: 7912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7344868 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.687 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.688 19:40:46 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79118104 kB' 'MemAvailable: 82392848 kB' 'Buffers: 11136 kB' 'Cached: 9299676 kB' 'SwapCached: 0 kB' 'Active: 6357548 kB' 'Inactive: 3442544 kB' 'Active(anon): 5966756 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492560 kB' 'Mapped: 179124 kB' 'Shmem: 5477476 kB' 
'KReclaimable: 190100 kB' 'Slab: 506108 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316008 kB' 'KernelStack: 16128 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7344884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.688 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.689 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue [... repetitive per-field scan elided: every remaining meminfo key from PageTables through HugePages_Rsvd fails the HugePages_Surp match and hits continue ...] 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79119016 kB' 'MemAvailable: 82393760 kB' 'Buffers: 11136 kB' 'Cached: 9299692 kB' 'SwapCached: 0 kB' 'Active: 6358128 kB' 'Inactive: 3442544 kB' 'Active(anon): 5967336 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 
kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493104 kB' 'Mapped: 179128 kB' 'Shmem: 5477492 kB' 'KReclaimable: 190100 kB' 'Slab: 506108 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316008 kB' 'KernelStack: 16144 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7347512 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.690 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue [... repetitive per-field scan elided: every remaining meminfo key fails the HugePages_Rsvd match and hits continue ...] 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 
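The lookups traced above all follow one pattern: read `/proc/meminfo` line by line with `IFS=': '`, skip every field until the requested key matches, then echo its value. Below is a minimal standalone sketch of that technique; the function name `get_meminfo_field` and the fixed snapshot are illustrative, not the exact setup/common.sh implementation (which also strips per-node `Node N ` prefixes when reading a node-specific meminfo file).

```shell
# Print the value column for one meminfo key (e.g. "HugePages_Rsvd").
# Reads meminfo-formatted text on stdin; returns 1 if the key is absent.
get_meminfo_field() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Skip every non-matching field, exactly as the trace does.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

# Example against a fixed snapshot (values taken from the trace above):
snapshot=$'HugePages_Total: 1024\nHugePages_Free: 1024\nHugePages_Rsvd: 0\nHugePages_Surp: 0'
get_meminfo_field HugePages_Rsvd <<< "$snapshot"   # prints 0
```

On a live system the same function can be fed `/proc/meminfo` directly, e.g. `get_meminfo_field HugePages_Total < /proc/meminfo`.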
00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:55.692 nr_hugepages=1024 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:55.692 resv_hugepages=0 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:55.692 surplus_hugepages=0 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:55.692 anon_hugepages=0 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79119880 kB' 'MemAvailable: 82394624 kB' 'Buffers: 11136 kB' 'Cached: 9299716 kB' 'SwapCached: 0 kB' 'Active: 6358216 kB' 'Inactive: 3442544 kB' 'Active(anon): 5967424 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493204 kB' 'Mapped: 179128 kB' 'Shmem: 5477516 kB' 'KReclaimable: 190100 kB' 'Slab: 506108 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 316008 kB' 'KernelStack: 16272 kB' 'PageTables: 8096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7346048 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.692 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue [... identical @31/@32 compare-and-continue trace repeated for every remaining /proc/meminfo field ...] 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:55.694 19:40:46
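The hugepages.sh assertions in this trace, `(( 1024 == nr_hugepages + surp + resv ))`, boil down to one invariant: the kernel's HugePages_Total must equal the requested page count plus any surplus and reserved pages. A sketch of that check, with function and variable names assumed rather than taken from the actual script:

```shell
# Sketch (names assumed) of the accounting invariant asserted around
# setup/hugepages.sh@107-110 in the trace: HugePages_Total must equal the
# requested nr_hugepages plus surplus plus reserved pages.
check_hugepage_accounting() {
  local nr_hugepages=$1 surp resv total
  surp=$(awk '/^HugePages_Surp:/ { print $2 }' /proc/meminfo)
  resv=$(awk '/^HugePages_Rsvd:/ { print $2 }' /proc/meminfo)
  total=$(awk '/^HugePages_Total:/ { print $2 }' /proc/meminfo)
  # Default to 0 on kernels that omit the HugePages_* fields entirely.
  [ "${total:-0}" -eq $(( nr_hugepages + ${surp:-0} + ${resv:-0} )) ]
}
```

For the run traced above (total 1024, surplus 0, reserved 0), `check_hugepage_accounting 1024` would succeed.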
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:55.694 19:40:46 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.694 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 44437168 kB' 'MemUsed: 3679772 kB' 'SwapCached: 0 kB' 'Active: 1825212 kB' 'Inactive: 89344 kB' 'Active(anon): 1627420 kB' 'Inactive(anon): 0 kB' 'Active(file): 197792 kB' 'Inactive(file): 89344 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1529640 kB' 'Mapped: 149336 kB' 'AnonPages: 388060 kB' 'Shmem: 1242504 kB' 'KernelStack: 10216 kB' 'PageTables: 5868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 103580 kB' 'Slab: 286860 kB' 'SReclaimable: 103580 kB' 'SUnreclaim: 183280 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:55.695 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ [... identical @31/@32 compare-and-continue trace repeated for each remaining node0 meminfo field ...] 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:55.696 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.697 19:40:46 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.697 19:40:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176552 kB' 'MemFree: 34682548 kB' 'MemUsed: 9494004 kB' 'SwapCached: 0 kB' 'Active: 4533112 kB' 'Inactive: 3353200 kB' 'Active(anon): 4340112 kB' 'Inactive(anon): 0 kB' 'Active(file): 193000 kB' 'Inactive(file): 3353200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7781232 kB' 'Mapped: 29792 kB' 'AnonPages: 104672 kB' 'Shmem: 4235032 kB' 'KernelStack: 6008 kB' 'PageTables: 2300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 86520 kB' 'Slab: 219248 kB' 'SReclaimable: 86520 kB' 'SUnreclaim: 132728 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.697 19:40:47 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.697 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 
19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.698 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:55.699 node0=512 expecting 512 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:55.699 node1=512 expecting 512 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:55.699 00:04:55.699 real 0m4.043s 00:04:55.699 user 0m1.560s 00:04:55.699 sys 0m2.587s 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:55.699 19:40:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:55.699 ************************************ 00:04:55.699 END TEST even_2G_alloc 00:04:55.699 ************************************ 
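The xtrace above is setup/common.sh's `get_meminfo` loop stepping key by key through a meminfo file: it reads each `Key: value` line with `IFS=': ' read -r var val _`, `continue`s until the key matches (here `HugePages_Surp`), then echoes the value. A minimal self-contained sketch of that pattern follows; the `MEMINFO_FILE` override and the trailing `echo 0` fallback are illustrative additions for this sketch, not SPDK's exact code.

```shell
get_meminfo() {
    # $1 = key to look up (e.g. HugePages_Surp), $2 = optional NUMA node.
    local get=$1 node=${2:-}
    local mem_f=${MEMINFO_FILE:-/proc/meminfo}  # override added for testing
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix each line with "Node N "; strip that prefix,
    # then split the remainder as "Key: value [kB]" -- same IFS as the log.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    echo 0  # key absent: report 0 (fallback assumed for this sketch)
}
```

The sed strip plays the role of the `mem=("${mem[@]#Node +([0-9]) }")` expansion visible in the trace, which removes the `Node N ` prefix that per-node meminfo files under /sys carry.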
00:04:55.699 19:40:47 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:55.699 19:40:47 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:55.699 19:40:47 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:55.699 19:40:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:55.699 ************************************ 00:04:55.699 START TEST odd_alloc 00:04:55.699 ************************************ 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.699 19:40:47 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:59.900 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:59.900 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:59.900 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:59.900 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 
00:04:59.900 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:59.900 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:59.900 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.900 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79094208 kB' 'MemAvailable: 82368952 kB' 'Buffers: 11136 kB' 'Cached: 9299824 kB' 'SwapCached: 0 kB' 'Active: 6359100 kB' 'Inactive: 3442544 kB' 'Active(anon): 5968308 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493932 kB' 'Mapped: 179180 kB' 'Shmem: 5477624 kB' 'KReclaimable: 190100 kB' 'Slab: 505992 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315892 kB' 'KernelStack: 16128 kB' 'PageTables: 7788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7345692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 
'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.901 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 
19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79094968 kB' 'MemAvailable: 82369712 kB' 'Buffers: 11136 kB' 'Cached: 9299828 kB' 'SwapCached: 0 kB' 'Active: 6358788 kB' 'Inactive: 3442544 kB' 'Active(anon): 5967996 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 
'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493644 kB' 'Mapped: 179136 kB' 'Shmem: 5477628 kB' 'KReclaimable: 190100 kB' 'Slab: 505992 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315892 kB' 'KernelStack: 16128 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7345344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.902 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 
19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.903 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.903 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.903 19:40:50 [trace trimmed: setup/common.sh@31-32 repeats the read/continue pair for each remaining /proc/meminfo field (PageTables through Unaccepted), none matching HugePages_Surp] 00:04:59.904 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:59.904 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79097872 kB' 'MemAvailable: 82372616 kB' 'Buffers: 11136 kB' 'Cached: 9299844 kB' 'SwapCached: 0 kB' 'Active: 6359220 kB' 'Inactive: 3442544 kB' 'Active(anon): 5968428 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 494064 kB' 'Mapped: 179140 kB' 'Shmem: 5477644 kB' 'KReclaimable: 190100 kB' 'Slab: 505992 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315892 kB' 'KernelStack: 16096 kB' 'PageTables: 7712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7348340 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 
0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.904 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.904 19:40:50 [trace trimmed: setup/common.sh@31-32 repeats the read/continue pair for each remaining /proc/meminfo field (SwapCached through HugePages_Free), none matching HugePages_Rsvd] 00:04:59.906 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:59.906 nr_hugepages=1025 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:59.906 resv_hugepages=0 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:59.906 surplus_hugepages=0 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:59.906 anon_hugepages=0 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.906 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79099000 kB' 'MemAvailable: 82373744 kB' 'Buffers: 11136 kB' 'Cached: 9299864 kB' 'SwapCached: 0 kB' 'Active: 6358816 kB' 'Inactive: 3442544 kB' 'Active(anon): 5968024 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493664 kB' 'Mapped: 179136 kB' 'Shmem: 5477664 kB' 'KReclaimable: 190100 kB' 'Slab: 505992 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315892 kB' 'KernelStack: 16176 kB' 'PageTables: 7680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485748 kB' 'Committed_AS: 7358132 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:04:59.906 
19:40:50 [trace trimmed: setup/common.sh@31-32 repeats the read/continue pair for each /proc/meminfo field from MemTotal onward while searching for HugePages_Total] 00:04:59.907 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 
19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 
19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.907 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:59.908 19:40:50 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:59.908 19:40:50 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 44421008 kB' 'MemUsed: 3695932 kB' 'SwapCached: 0 kB' 'Active: 1826084 kB' 
'Inactive: 89344 kB' 'Active(anon): 1628292 kB' 'Inactive(anon): 0 kB' 'Active(file): 197792 kB' 'Inactive(file): 89344 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1529740 kB' 'Mapped: 149348 kB' 'AnonPages: 388856 kB' 'Shmem: 1242604 kB' 'KernelStack: 10376 kB' 'PageTables: 6128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 103580 kB' 'Slab: 286764 kB' 'SReclaimable: 103580 kB' 'SUnreclaim: 183184 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.908 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 
19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 
-- # get_meminfo HugePages_Surp 1 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176552 kB' 'MemFree: 34681284 kB' 'MemUsed: 9495268 kB' 'SwapCached: 0 kB' 'Active: 4532760 kB' 'Inactive: 3353200 kB' 'Active(anon): 4339760 kB' 'Inactive(anon): 0 kB' 'Active(file): 193000 kB' 'Inactive(file): 3353200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7781304 kB' 'Mapped: 29792 kB' 'AnonPages: 104748 kB' 'Shmem: 4235104 kB' 'KernelStack: 5832 kB' 'PageTables: 1604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 86520 kB' 'Slab: 219156 kB' 'SReclaimable: 86520 kB' 'SUnreclaim: 132636 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 
513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.909 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.910 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.910 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.911 19:40:51 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:59.911 node0=512 expecting 513 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:59.911 node1=513 expecting 512 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:59.911 00:04:59.911 real 0m3.920s 00:04:59.911 user 0m1.523s 00:04:59.911 sys 0m2.501s 00:04:59.911 19:40:51 
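The `get_meminfo` calls traced above read a meminfo file line by line with `IFS=': ' read -r var val _`, `continue` past every key until the requested one (here `HugePages_Surp`) matches, then echo its value, falling back to `echo 0`. A minimal standalone sketch of that pattern (the function name and the optional file-override argument are illustrative, not SPDK's actual helper; the paths and key names are the standard Linux ones seen in the trace):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern traced above: print the value of
# one "Key: value" line from a meminfo file, or 0 if the key is absent.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=${3:-/proc/meminfo}   # file override added for illustration
    # per-NUMA-node counters live in sysfs when a node number is given
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # sysfs per-node lines carry a "Node N " prefix (the trace strips it
    # with ${mem[@]#Node +([0-9]) }); strip it here with sed, then split
    # each "Key:   value kB" line on ':' and whitespace
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    echo 0   # key not present: report 0, like the traced fallback
}
```

The long `continue` runs in the log are exactly this loop skipping each non-matching `/proc/meminfo` key in turn.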
setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:59.911 19:40:51 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:59.911 ************************************ 00:04:59.911 END TEST odd_alloc 00:04:59.911 ************************************ 00:04:59.911 19:40:51 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:59.911 19:40:51 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:59.911 19:40:51 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:59.911 19:40:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:59.911 ************************************ 00:04:59.911 START TEST custom_alloc 00:04:59.911 ************************************ 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:59.911 
19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 
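The `get_test_nr_hugepages_per_node` trace above distributes the requested page count evenly across NUMA nodes when no explicit per-node counts are passed (512 pages over `_no_nodes=2` becomes 256 per node). A rough sketch of that split, under the assumption that any remainder would go to lower-numbered nodes (the trace only shows the even 512/2 case, and the function name is illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the even per-node hugepage split seen in the trace:
# divide nr_hugepages across no_nodes and print one count per node.
split_hugepages_per_node() {
    local nr_hugepages=$1 no_nodes=$2
    local -a nodes_test=()
    local per_node=$((nr_hugepages / no_nodes))
    local rem=$((nr_hugepages % no_nodes))
    local node
    for ((node = 0; node < no_nodes; node++)); do
        nodes_test[node]=$per_node
        # remainder handling is an assumption; the traced run divides evenly
        if ((node < rem)); then
            nodes_test[node]=$((nodes_test[node] + 1))
        fi
    done
    printf '%s\n' "${nodes_test[@]}"
}
```

With 512 pages and 2 nodes this yields 256 and 256, matching the repeated `nodes_test[_no_nodes - 1]=256` assignments in the trace.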
00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 
00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # 
nodes_test[_no_nodes]=1024 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.911 19:40:51 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:03.291 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:03.291 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:03.291 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:03.291 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:03.291 0000:80:04.0 (8086 2021): Already using 
the vfio-pci driver 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.291 19:40:54 
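Before invoking `setup.sh`, the `custom_alloc` prologue above builds the `HUGENODE` spec by appending one `nodes_hp[N]=count` entry per node and joining them with the `local IFS=,` set at the top of the test, producing `HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'`. A small sketch of that string construction (the helper name is illustrative; the joined format is exactly what the trace shows being exported):

```shell
#!/usr/bin/env bash
# Build a HUGENODE spec like "nodes_hp[0]=512,nodes_hp[1]=1024"
# from a list of per-node hugepage counts, mirroring the trace above.
build_hugenode_spec() {
    local -a nodes_hp=("$@")   # positional args are per-node page counts
    local -a parts=()
    local node
    for node in "${!nodes_hp[@]}"; do
        parts+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    local IFS=,                # comma-join, as the traced 'local IFS=,' does
    echo "${parts[*]}"
}
```

Note that `"${parts[*]}"` (not `[@]`) is what makes `IFS` act as the join separator, which is why the trace declares `local IFS=,` up front.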
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 78067816 kB' 'MemAvailable: 81342560 kB' 'Buffers: 11136 kB' 'Cached: 9299976 kB' 'SwapCached: 0 kB' 'Active: 6360260 kB' 'Inactive: 3442544 kB' 'Active(anon): 5969468 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 494452 kB' 'Mapped: 179320 kB' 'Shmem: 5477776 kB' 'KReclaimable: 190100 kB' 'Slab: 505612 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315512 kB' 'KernelStack: 16144 kB' 'PageTables: 7844 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7346356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.291 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.292 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 
19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.293 
19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 78068912 kB' 'MemAvailable: 81343656 kB' 'Buffers: 11136 kB' 'Cached: 9299980 kB' 'SwapCached: 0 kB' 'Active: 6359412 kB' 'Inactive: 3442544 kB' 'Active(anon): 5968620 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 
'Writeback: 0 kB' 'AnonPages: 494100 kB' 'Mapped: 179156 kB' 'Shmem: 5477780 kB' 'KReclaimable: 190100 kB' 'Slab: 505596 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315496 kB' 'KernelStack: 16128 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7346376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.293 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.294 
19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.294 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.295 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
00:05:03.295-00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [condensed: per-key scan of /proc/meminfo; every key from PageTables through HugePages_Rsvd was compared against HugePages_Surp and hit "continue" until HugePages_Surp matched]
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.296 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.297 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:03.297 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:03.297 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 78069612 kB' 'MemAvailable: 81344356 kB' 'Buffers: 11136 kB' 'Cached: 9299980 kB' 'SwapCached: 0 kB' 'Active: 6359116 kB' 'Inactive: 3442544 kB' 'Active(anon): 5968324 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493796 kB' 'Mapped: 179156 kB' 'Shmem: 5477780 kB' 'KReclaimable: 190100 kB' 'Slab: 505596 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315496 kB' 'KernelStack: 16128 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7346396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB'
00:05:03.297-00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [condensed: per-key scan; every key from MemTotal through HugePages_Free was compared against HugePages_Rsvd and hit "continue" until HugePages_Rsvd matched]
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
nr_hugepages=1536
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:03.300 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 78068860 kB' 'MemAvailable: 81343604 kB' 'Buffers: 11136 kB' 'Cached: 9300036 kB' 'SwapCached: 0 kB' 'Active: 6359092 kB' 'Inactive: 3442544 kB' 'Active(anon): 5968300 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 493704 kB' 'Mapped: 179156 kB' 'Shmem: 5477836 kB' 'KReclaimable: 190100 kB' 'Slab: 505596 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315496 kB' 'KernelStack: 16112 kB' 'PageTables: 7744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962484 kB' 'Committed_AS: 7346416 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB'
00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [condensed: per-key scan against HugePages_Total in progress; trace truncated mid-scan after Inactive(file)]
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.301 19:40:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 
19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.302 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 
19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in 
/sys/devices/system/node/node+([0-9]) 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:03.303 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 44438328 kB' 'MemUsed: 
3678612 kB' 'SwapCached: 0 kB' 'Active: 1825088 kB' 'Inactive: 89344 kB' 'Active(anon): 1627296 kB' 'Inactive(anon): 0 kB' 'Active(file): 197792 kB' 'Inactive(file): 89344 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1529736 kB' 'Mapped: 149368 kB' 'AnonPages: 387772 kB' 'Shmem: 1242600 kB' 'KernelStack: 10200 kB' 'PageTables: 5804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 103580 kB' 'Slab: 286564 kB' 'SReclaimable: 103580 kB' 'SUnreclaim: 182984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.304 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.305 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:03.305 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:03.305 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
[... setup/common.sh@31-32: the IFS=': ' / read -r / continue loop scans the remaining node0 meminfo fields (KernelStack … HugePages_Free) until it reaches HugePages_Surp ...]
00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:03.568 19:40:54
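The scan traced above is `setup/common.sh`'s `get_meminfo` at work: it picks `/proc/meminfo` (or a NUMA node's meminfo under `/sys`), strips the `Node N ` prefix on per-node files, and reads `field: value` pairs until the requested field matches, echoing its value. A minimal standalone sketch of that pattern — the function body is reconstructed from the trace, not copied from SPDK's sources:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern seen in the trace (reconstructed, not
# SPDK's exact code): scan "field: value" rows until the requested field.
get_meminfo() {
	local get=$1 node=${2:-} mem_f=/proc/meminfo line var val
	# Per-node counters live under /sys when a NUMA node is requested
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	while read -r line; do
		# Per-node files prefix every row with "Node N "; drop it
		[[ -n $node ]] && line=${line#"Node $node "}
		IFS=': ' read -r var val _ <<<"$line"
		[[ $var == "$get" ]] || continue   # same skip-until-match loop as above
		echo "$val"
		return 0
	done <"$mem_f"
	return 1
}

get_meminfo MemTotal   # prints the MemTotal value in kB
```

The traced version reads the whole file into an array with `mapfile` and strips the prefix with `${mem[@]#Node +([0-9]) }` (extglob); the line-at-a-time loop above is equivalent for a single lookup.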
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.568 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176552 kB' 'MemFree: 33630360 kB' 'MemUsed: 10546192 kB' 'SwapCached: 0 kB' 'Active: 4534404 kB' 'Inactive: 3353200 kB' 'Active(anon): 4341404 kB' 'Inactive(anon): 0 kB' 'Active(file): 193000 kB' 'Inactive(file): 3353200 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7781460 kB' 'Mapped: 29788 kB' 'AnonPages: 106344 kB' 'Shmem: 4235260 kB' 'KernelStack: 5928 kB' 'PageTables: 1996 
kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 86520 kB' 'Slab: 219032 kB' 'SReclaimable: 86520 kB' 'SUnreclaim: 132512 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... setup/common.sh@31-32: the IFS=': ' / read -r / continue loop scans each node1 meminfo field (MemTotal … HugePages_Free, as dumped above) until it reaches HugePages_Surp ...]
00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:03.570 19:40:54
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:03.570 node0=512 expecting 512 00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:03.570 node1=1024 expecting 1024 00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:03.570 00:05:03.570 real 0m3.778s 00:05:03.570 user 0m1.410s 00:05:03.570 sys 0m2.457s 00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:03.570 19:40:54 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:03.570 ************************************ 00:05:03.570 END TEST custom_alloc 00:05:03.571 ************************************ 00:05:03.571 19:40:54 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:03.571 19:40:54 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:03.571 19:40:54 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:03.571 19:40:54 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:03.571 ************************************ 00:05:03.571 START TEST no_shrink_alloc 00:05:03.571 ************************************ 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:03.571 19:40:54 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 
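The `get_test_nr_hugepages` steps traced above turn the requested pool size into a page count: with the default 2048 kB hugepage size, the 2097152 kB request becomes `nr_hugepages=1024`, pinned to node 0 via `node_ids=('0')`. The arithmetic, as a sketch (values taken from the trace; the hugepage size is assumed to be the x86_64 default, and the kB unit for `size` is inferred from the traced result):

```shell
# Reconstructed arithmetic behind "nr_hugepages=1024" in the trace above.
size_kb=2097152      # requested hugepage pool: 2 GiB, expressed in kB (inferred unit)
hugepage_kb=2048     # default hugepage size on x86_64 ("Hugepagesize: 2048 kB")
nr_hugepages=$((size_kb / hugepage_kb))
echo "nr_hugepages=$nr_hugepages"   # → nr_hugepages=1024
```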
00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.571 19:40:54 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:06.861 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:06.861 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:06.861 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:06.861 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:06.861 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:07.125 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:07.125 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:07.125 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:07.125 
19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79071252 kB' 'MemAvailable: 82345996 kB' 'Buffers: 11136 kB' 'Cached: 9300124 kB' 
'SwapCached: 0 kB' 'Active: 6360288 kB' 'Inactive: 3442544 kB' 'Active(anon): 5969496 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 494788 kB' 'Mapped: 179216 kB' 'Shmem: 5477924 kB' 'KReclaimable: 190100 kB' 'Slab: 505468 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315368 kB' 'KernelStack: 16160 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7346760 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.125 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.125 
[trace elided: setup/common.sh@32 repeats the same test-and-continue for each remaining /proc/meminfo key (MemAvailable through HardwareCorrupted) until it reaches AnonHugePages]
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.126 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79072764 kB' 'MemAvailable: 82347508 kB' 'Buffers: 11136 kB' 'Cached: 9300128 kB' 'SwapCached: 0 kB' 'Active: 6360268 kB' 'Inactive: 3442544 kB' 'Active(anon): 5969476 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 
kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 494820 kB' 'Mapped: 179168 kB' 'Shmem: 5477928 kB' 'KReclaimable: 190100 kB' 'Slab: 505516 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315416 kB' 'KernelStack: 16144 kB' 'PageTables: 7808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7346780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.127 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
[trace elided: setup/common.sh@32 repeats the same test-and-continue for each /proc/meminfo key while scanning for HugePages_Surp]
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.128 19:40:58 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.128 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.129 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79073212 kB' 'MemAvailable: 82347956 kB' 'Buffers: 11136 kB' 'Cached: 9300144 kB' 'SwapCached: 0 kB' 'Active: 6359988 kB' 'Inactive: 3442544 kB' 'Active(anon): 5969196 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 494484 kB' 'Mapped: 179168 kB' 'Shmem: 5477944 kB' 'KReclaimable: 190100 kB' 'Slab: 505512 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315412 kB' 'KernelStack: 16144 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7346800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:07.129 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.129 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.129 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.129 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.129 19:40:58 
setup.sh.hugepages.no_shrink_alloc -- [repetitive trace elided: setup/common.sh@31-32 skips every /proc/meminfo field with 'continue' until HugePages_Rsvd matches] setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.130 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.130 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.130 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:07.130 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:07.130 nr_hugepages=1024 00:05:07.130 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:07.130 resv_hugepages=0 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:07.131 surplus_hugepages=0 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:07.131 anon_hugepages=0 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
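The flood of `IFS=': '` / `read -r var val _` / `continue` lines in this trace is setup/common.sh's get_meminfo walking /proc/meminfo one "Field: value" line at a time and skipping every field that is not the one requested. A minimal standalone sketch of that parsing pattern (a hypothetical helper, not the actual SPDK function; the optional file argument is added here only for testability):

```shell
#!/usr/bin/env bash
# Sketch of the per-field /proc/meminfo scan seen in the trace above.
# IFS=': ' splits "Field: value [kB]" into var/val; non-matching
# fields are skipped with 'continue' (the lines flooding this log).
get_meminfo() {
    local get=$1 src=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # not the field we want
        echo "$val"                        # value only; unit lands in _
        return 0
    done < "$src"
    return 1                               # field not present
}
```

On the node in this log, `get_meminfo HugePages_Total` would print 1024, matching the `HugePages_Total: 1024` entry in the snapshots above.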
00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79073416 kB' 'MemAvailable: 82348160 kB' 'Buffers: 11136 kB' 'Cached: 9300164 kB' 'SwapCached: 0 kB' 'Active: 6359856 kB' 'Inactive: 3442544 kB' 'Active(anon): 5969064 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 494316 kB' 'Mapped: 179168 kB' 'Shmem: 5477964 kB' 'KReclaimable: 190100 kB' 'Slab: 505512 kB' 'SReclaimable: 190100 kB' 'SUnreclaim: 315412 kB' 'KernelStack: 16128 kB' 'PageTables: 7748 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7346824 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:07.131 19:40:58 [repetitive trace elided: setup/common.sh@31-32 skips each /proc/meminfo field with 'continue' while scanning for HugePages_Total] setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.131 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.131 19:40:58 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 
19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:07.132 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 43375728 kB' 'MemUsed: 4741212 kB' 'SwapCached: 0 kB' 'Active: 1825212 kB' 'Inactive: 89344 kB' 'Active(anon): 1627420 kB' 'Inactive(anon): 0 kB' 'Active(file): 197792 kB' 'Inactive(file): 89344 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1529740 kB' 'Mapped: 149380 kB' 'AnonPages: 387992 kB' 'Shmem: 1242604 kB' 'KernelStack: 10232 kB' 'PageTables: 5856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 103580 kB' 'Slab: 286428 kB' 'SReclaimable: 103580 kB' 'SUnreclaim: 182848 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.133 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.393 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.394 19:40:58 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:07.394 node0=1024 expecting 1024 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:07.394 19:40:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:10.684 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:10.684 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:10.947 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:5e:00.0 (8086 0b60): Already 
using the vfio-pci driver 00:05:10.947 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:10.947 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:10.947 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ 
always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79071204 kB' 'MemAvailable: 82345952 kB' 'Buffers: 11136 kB' 'Cached: 9300264 kB' 'SwapCached: 0 kB' 'Active: 6362356 kB' 'Inactive: 3442544 kB' 'Active(anon): 5971564 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 496068 kB' 'Mapped: 179180 kB' 'Shmem: 5478064 kB' 'KReclaimable: 190108 kB' 'Slab: 505540 kB' 'SReclaimable: 190108 kB' 'SUnreclaim: 315432 kB' 
'KernelStack: 16304 kB' 'PageTables: 8276 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7347448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.947 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 
00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.948 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79071928 kB' 'MemAvailable: 82346676 kB' 'Buffers: 11136 kB' 'Cached: 9300276 kB' 'SwapCached: 0 kB' 'Active: 6362024 kB' 'Inactive: 3442544 kB' 'Active(anon): 5971232 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 496408 kB' 'Mapped: 179172 kB' 'Shmem: 5478076 kB' 'KReclaimable: 190108 kB' 'Slab: 505524 kB' 'SReclaimable: 190108 kB' 'SUnreclaim: 315416 kB' 'KernelStack: 16272 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 
7347104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.949 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.950 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79072232 kB' 'MemAvailable: 82346980 kB' 'Buffers: 11136 kB' 'Cached: 9300296 kB' 
'SwapCached: 0 kB' 'Active: 6363900 kB' 'Inactive: 3442544 kB' 'Active(anon): 5973108 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 498260 kB' 'Mapped: 179736 kB' 'Shmem: 5478096 kB' 'KReclaimable: 190108 kB' 'Slab: 505524 kB' 'SReclaimable: 190108 kB' 'SUnreclaim: 315416 kB' 'KernelStack: 16224 kB' 'PageTables: 8076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7349804 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.951 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.285 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:11.286 nr_hugepages=1024 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:11.286 resv_hugepages=0 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:11.286 surplus_hugepages=0 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:11.286 anon_hugepages=0 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.286 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293492 kB' 'MemFree: 79065684 kB' 'MemAvailable: 82340432 kB' 'Buffers: 11136 kB' 'Cached: 9300316 kB' 'SwapCached: 0 kB' 'Active: 6367312 kB' 'Inactive: 3442544 kB' 'Active(anon): 5976520 kB' 'Inactive(anon): 0 kB' 'Active(file): 390792 kB' 'Inactive(file): 3442544 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 501652 kB' 'Mapped: 180088 kB' 'Shmem: 5478116 kB' 'KReclaimable: 190108 kB' 'Slab: 505524 kB' 'SReclaimable: 190108 kB' 'SUnreclaim: 315416 kB' 'KernelStack: 16240 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486772 kB' 'Committed_AS: 7353272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 50240 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 
0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 712100 kB' 'DirectMap2M: 14692352 kB' 'DirectMap1G: 85983232 kB' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.287 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in 
"${!nodes_test[@]}" 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.288 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 43369444 kB' 'MemUsed: 4747496 kB' 'SwapCached: 0 kB' 'Active: 1827924 kB' 'Inactive: 89344 kB' 'Active(anon): 1630132 kB' 'Inactive(anon): 0 kB' 'Active(file): 197792 kB' 'Inactive(file): 89344 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1529764 kB' 'Mapped: 149384 kB' 'AnonPages: 390640 kB' 'Shmem: 1242628 kB' 'KernelStack: 10360 kB' 'PageTables: 6248 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 103580 kB' 'Slab: 286544 kB' 'SReclaimable: 103580 kB' 'SUnreclaim: 182964 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 
19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.289 
19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.289 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- 
# (( nodes_test[node] += 0 )) 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:11.290 node0=1024 expecting 1024 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:11.290 00:05:11.290 real 0m7.629s 00:05:11.290 user 0m2.949s 00:05:11.290 sys 0m4.874s 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:11.290 19:41:02 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:11.290 ************************************ 00:05:11.290 END TEST no_shrink_alloc 00:05:11.290 ************************************ 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in 
"/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:11.290 19:41:02 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:11.290 00:05:11.290 real 0m30.621s 00:05:11.290 user 0m10.834s 00:05:11.290 sys 0m18.032s 00:05:11.290 19:41:02 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:11.290 19:41:02 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:11.290 ************************************ 00:05:11.290 END TEST hugepages 00:05:11.290 ************************************ 00:05:11.290 19:41:02 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:11.290 19:41:02 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:11.290 19:41:02 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:11.290 19:41:02 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:11.290 ************************************ 00:05:11.290 START TEST driver 00:05:11.290 ************************************ 00:05:11.290 19:41:02 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:11.550 * Looking for test storage... 
00:05:11.550 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:11.550 19:41:02 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:11.550 19:41:02 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:11.550 19:41:02 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:16.825 19:41:07 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:16.825 19:41:07 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:16.825 19:41:07 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:16.825 19:41:07 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:16.825 ************************************ 00:05:16.825 START TEST guess_driver 00:05:16.825 ************************************ 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 216 > 0 )) 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:16.825 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:16.825 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:16.825 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:16.825 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:16.825 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:16.825 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:16.825 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:16.825 Looking for driver=vfio-pci 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:16.825 19:41:07 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 
00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.118 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.119 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.119 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 
00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.378 19:41:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:22.913 19:41:14 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:22.913 
19:41:14 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:22.913 19:41:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:22.913 19:41:14 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:22.913 19:41:14 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:22.913 19:41:14 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:22.913 19:41:14 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:28.188 00:05:28.188 real 0m11.448s 00:05:28.188 user 0m2.857s 00:05:28.188 sys 0m5.405s 00:05:28.188 19:41:19 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:28.188 19:41:19 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:28.188 ************************************ 00:05:28.188 END TEST guess_driver 00:05:28.188 ************************************ 00:05:28.188 00:05:28.188 real 0m16.709s 00:05:28.188 user 0m4.356s 00:05:28.188 sys 0m8.393s 00:05:28.188 19:41:19 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:28.188 19:41:19 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:28.188 ************************************ 00:05:28.188 END TEST driver 00:05:28.188 ************************************ 00:05:28.188 19:41:19 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:28.188 19:41:19 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:28.188 19:41:19 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:28.188 19:41:19 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:28.188 ************************************ 00:05:28.188 START TEST devices 00:05:28.188 ************************************ 
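The guess_driver trace above settles on vfio-pci in two steps: setup/driver.sh@29 requires at least one populated IOMMU group (216 were counted here), and setup/driver.sh@12 requires `modprobe --show-depends vfio_pci` to resolve to an insmod chain of real `*.ko` files. A sketch of that decision follows; `choose_driver_sketch` is an illustrative name, and it takes the group count and modprobe output as arguments so the logic can be exercised without root or sysfs access.

```shell
# Sketch of the vfio-pci eligibility decision traced above (setup/driver.sh style).
# choose_driver_sketch is an illustrative name, not SPDK's real function.
choose_driver_sketch() {
    local iommu_group_count=$1 modprobe_out=$2
    # driver.sh@29: vfio needs at least one populated IOMMU group.
    (( iommu_group_count > 0 )) || { echo 'No valid driver found'; return 1; }
    # driver.sh@12: a "*.ko*" match in the insmod chain means vfio_pci
    # resolves to loadable kernel modules on this kernel.
    if [[ $modprobe_out == *.ko* ]]; then
        echo vfio-pci
    else
        echo 'No valid driver found'
        return 1
    fi
}
```

With 216 groups and the insmod chain shown in the trace, this prints `vfio-pci`, matching the `driver=vfio-pci` assignment at setup/driver.sh@49; with zero groups or a builtin-only chain it reports no valid driver, which is the `No valid driver found` branch the trace tests at setup/driver.sh@51.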
00:05:28.188 19:41:19 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:28.188 * Looking for test storage... 00:05:28.188 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:28.189 19:41:19 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:28.189 19:41:19 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:28.189 19:41:19 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:28.189 19:41:19 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@198 -- # 
min_disk_size=3221225472 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:32.456 19:41:23 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:32.456 No valid GPT data, bailing 00:05:32.456 19:41:23 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:32.456 19:41:23 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:32.456 19:41:23 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:32.456 19:41:23 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:32.456 19:41:23 setup.sh.devices -- 
setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:32.456 19:41:23 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:32.456 ************************************ 00:05:32.456 START TEST nvme_mount 00:05:32.456 ************************************ 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:32.456 19:41:23 
setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:32.456 19:41:23 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:33.395 Creating new GPT entries in memory. 00:05:33.395 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:33.395 other utilities. 00:05:33.395 19:41:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:33.395 19:41:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:33.395 19:41:24 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:33.395 19:41:24 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:33.395 19:41:24 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:34.332 Creating new GPT entries in memory. 00:05:34.332 The operation has completed successfully. 
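The `sgdisk /dev/nvme0n1 --new=1:2048:2099199` call above is built from the arithmetic traced at common.sh@51, @58, and @59: a 1 GiB partition size is converted to 512-byte sectors, the first partition starts at sector 2048, and the end sector is start + size - 1. A minimal sketch reproducing those bounds:

```shell
#!/usr/bin/env bash
# Recompute the sgdisk bounds seen in the log, following setup/common.sh.
size=1073741824                          # bytes (1 GiB), common.sh@41
(( size /= 512 ))                        # common.sh@51: bytes -> 512-byte sectors
part_start=2048                          # common.sh@58: first partition start
(( part_end = part_start + size - 1 ))   # common.sh@59: inclusive end sector
echo "--new=1:${part_start}:${part_end}" # prints --new=1:2048:2099199
```

The second partition in the later dm_mount run starts at `part_end + 1` (common.sh@58's ternary), so consecutive partitions tile the disk without gaps.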
00:05:34.332 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:34.332 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:34.332 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1314638 00:05:34.332 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.332 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:34.332 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:34.591 
19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:34.591 19:41:25 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.880 19:41:29 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:38.140 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:38.140 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:38.399 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:38.399 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:38.399 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:38.399 
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:38.399 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:38.399 19:41:29 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:38.399 19:41:29 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.399 19:41:29 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:38.399 19:41:29 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:38.400 19:41:29 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:42.591 19:41:33 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:42.591 19:41:33 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:42.591 19:41:33 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.884 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all 
/dev/nvme0n1 00:05:45.885 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:45.885 00:05:45.885 real 0m13.389s 00:05:45.885 user 0m3.950s 00:05:45.885 sys 0m7.420s 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:45.885 19:41:37 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:45.885 ************************************ 00:05:45.885 END TEST nvme_mount 00:05:45.885 ************************************ 00:05:45.885 19:41:37 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:45.885 19:41:37 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:45.885 19:41:37 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:45.885 19:41:37 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:45.885 ************************************ 00:05:45.885 START TEST dm_mount 00:05:45.885 ************************************ 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:45.885 19:41:37 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:45.885 19:41:37 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:46.887 Creating new GPT entries in memory. 00:05:46.887 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:46.887 other utilities. 00:05:46.887 19:41:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:46.887 19:41:38 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:46.887 19:41:38 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:46.887 19:41:38 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:46.887 19:41:38 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:47.825 Creating new GPT entries in memory. 
00:05:47.825 The operation has completed successfully. 00:05:47.825 19:41:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:47.825 19:41:39 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:47.825 19:41:39 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:47.825 19:41:39 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:47.825 19:41:39 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:49.204 The operation has completed successfully. 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1318907 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 
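The two `sgdisk --new` calls above land on sectors 2048:2099199 and 2099200:4196351. That layout follows directly from the loop in setup/common.sh: 1 GiB per partition (`size=1073741824`), 512-byte sectors, first partition aligned at sector 2048. A standalone sketch of the same arithmetic (it only prints the commands, it does not touch any disk):

```shell
size=1073741824
(( size /= 512 ))              # bytes -> 512-byte sectors: 2097152
part_no=2
part_start=0
part_end=0
cmds=()
for (( part = 1; part <= part_no; part++ )); do
    # First partition starts at sector 2048; each later one starts right after the previous
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    cmds+=("sgdisk /dev/nvme0n1 --new=$part:$part_start:$part_end")
done
printf '%s\n' "${cmds[@]}"
```

On a real test node the script additionally wraps each sgdisk call in `flock /dev/nvme0n1` (visible in the trace above) to serialize against concurrent udev activity; the sketch omits that since it never executes the commands.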
00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:49.204 19:41:40 
setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:49.204 19:41:40 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:49.205 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:49.205 19:41:40 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 
== \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.498 19:41:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.498 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:52.498 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:52.498 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:52.498 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:52.498 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- 
setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:52.758 19:41:44 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 
00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.051 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- 
setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:56.311 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:56.311 00:05:56.311 real 0m10.510s 00:05:56.311 user 0m2.772s 00:05:56.311 sys 0m4.879s 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:56.311 19:41:47 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:56.311 ************************************ 00:05:56.311 END TEST dm_mount 00:05:56.311 ************************************ 00:05:56.311 19:41:47 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:56.311 19:41:47 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:56.311 19:41:47 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.311 19:41:47 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:56.311 19:41:47 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:56.311 19:41:47 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:56.311 19:41:47 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:56.571 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:56.571 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:56.571 /dev/nvme0n1: 2 bytes were 
erased at offset 0x000001fe (PMBR): 55 aa 00:05:56.571 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:56.571 19:41:48 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:56.571 19:41:48 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:56.899 19:41:48 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:56.899 19:41:48 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:56.899 19:41:48 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:56.899 19:41:48 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:56.899 19:41:48 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:56.899 00:05:56.899 real 0m28.616s 00:05:56.899 user 0m8.287s 00:05:56.899 sys 0m15.382s 00:05:56.899 19:41:48 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:56.899 19:41:48 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:56.899 ************************************ 00:05:56.899 END TEST devices 00:05:56.899 ************************************ 00:05:56.899 00:05:56.899 real 1m43.906s 00:05:56.899 user 0m32.300s 00:05:56.899 sys 0m58.349s 00:05:56.899 19:41:48 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:56.899 19:41:48 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:56.899 ************************************ 00:05:56.899 END TEST setup.sh 00:05:56.899 ************************************ 00:05:56.899 19:41:48 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:06:01.098 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:01.098 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:01.099 Hugepages 00:06:01.099 node hugesize free / total 00:06:01.099 node0 1048576kB 0 / 
0 00:06:01.099 node0 2048kB 1024 / 1024 00:06:01.099 node1 1048576kB 0 / 0 00:06:01.099 node1 2048kB 1024 / 1024 00:06:01.099 00:06:01.099 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:01.099 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:01.099 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:01.099 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:01.099 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:01.099 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:01.099 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:01.099 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:01.099 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:01.099 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:06:01.099 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:01.099 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:01.099 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:01.099 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:01.099 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:01.099 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:01.099 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:01.099 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:01.099 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:06:01.099 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:06:01.099 19:41:52 -- spdk/autotest.sh@130 -- # uname -s 00:06:01.099 19:41:52 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:06:01.099 19:41:52 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:06:01.099 19:41:52 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:04.388 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:04.388 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:04.388 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:04.388 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:04.388 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:04.388 0000:00:04.4 (8086 2021): ioatdma 
-> vfio-pci 00:06:04.388 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:04.388 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:04.388 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:04.389 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:06.923 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:07.181 19:41:58 -- common/autotest_common.sh@1532 -- # sleep 1 00:06:08.118 19:41:59 -- common/autotest_common.sh@1533 -- # bdfs=() 00:06:08.118 19:41:59 -- common/autotest_common.sh@1533 -- # local bdfs 00:06:08.118 19:41:59 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:06:08.118 19:41:59 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:06:08.118 19:41:59 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:08.118 19:41:59 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:08.118 19:41:59 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:08.118 19:41:59 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:08.118 19:41:59 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:08.118 19:41:59 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:08.118 19:41:59 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:08.118 19:41:59 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:12.309 0000:d7:05.5 (8086 201d): Skipping not allowed 
VMD controller at 0000:d7:05.5 00:06:12.309 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:12.309 Waiting for block devices as requested 00:06:12.309 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:06:12.309 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:12.309 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:12.309 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:12.309 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:12.309 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:12.309 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:12.309 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:12.568 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:12.568 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:12.568 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:12.827 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:12.827 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:12.827 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:13.085 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:13.085 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:13.085 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:13.344 19:42:04 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:06:13.344 19:42:04 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:06:13.344 19:42:04 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:06:13.344 19:42:04 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1507 -- 
# printf '%s\n' nvme0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:06:13.344 19:42:04 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1545 -- # grep oacs 00:06:13.344 19:42:04 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:06:13.344 19:42:04 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:06:13.344 19:42:04 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:06:13.344 19:42:04 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:06:13.344 19:42:04 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:06:13.344 19:42:04 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:06:13.344 19:42:04 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:06:13.344 19:42:04 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:06:13.344 19:42:04 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:06:13.344 19:42:04 -- common/autotest_common.sh@1557 -- # continue 00:06:13.344 19:42:04 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:06:13.344 19:42:04 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:13.344 19:42:04 -- common/autotest_common.sh@10 -- # set +x 00:06:13.344 19:42:04 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:06:13.344 19:42:04 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:13.344 19:42:04 -- common/autotest_common.sh@10 -- # set +x 00:06:13.344 19:42:04 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:17.545 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:17.545 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:17.545 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 
00:06:17.545 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:17.545 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:20.082 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:20.082 19:42:11 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:20.082 19:42:11 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:20.082 19:42:11 -- common/autotest_common.sh@10 -- # set +x 00:06:20.082 19:42:11 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:20.082 19:42:11 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:20.082 19:42:11 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:20.082 19:42:11 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:20.082 19:42:11 -- common/autotest_common.sh@1577 -- # local bdfs 00:06:20.082 19:42:11 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:20.082 19:42:11 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:20.082 19:42:11 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:20.082 19:42:11 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:20.082 19:42:11 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:20.082 19:42:11 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 
00:06:20.082 19:42:11 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:20.082 19:42:11 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:20.082 19:42:11 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:20.082 19:42:11 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:20.082 19:42:11 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:06:20.082 19:42:11 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:06:20.082 19:42:11 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:06:20.082 19:42:11 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:06:20.082 19:42:11 -- common/autotest_common.sh@1593 -- # return 0 00:06:20.082 19:42:11 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:20.082 19:42:11 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:20.082 19:42:11 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:20.082 19:42:11 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:20.082 19:42:11 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:20.670 Restarting all devices. 00:06:24.917 lstat() error: No such file or directory 00:06:24.917 QAT Error: No GENERAL section found 00:06:24.917 Failed to configure qat_dev0 00:06:24.917 lstat() error: No such file or directory 00:06:24.917 QAT Error: No GENERAL section found 00:06:24.917 Failed to configure qat_dev1 00:06:24.917 lstat() error: No such file or directory 00:06:24.917 QAT Error: No GENERAL section found 00:06:24.917 Failed to configure qat_dev2 00:06:24.917 enable sriov 00:06:24.917 Checking status of all devices. 
00:06:24.917 There is 3 QAT acceleration device(s) in the system: 00:06:24.917 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:06:24.917 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:06:24.917 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:06:24.917 0000:3d:00.0 set to 16 VFs 00:06:25.858 0000:3f:00.0 set to 16 VFs 00:06:26.425 0000:da:00.0 set to 16 VFs 00:06:28.329 Properly configured the qat device with driver uio_pci_generic. 00:06:28.329 19:42:19 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:28.329 19:42:19 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:28.329 19:42:19 -- common/autotest_common.sh@10 -- # set +x 00:06:28.329 19:42:19 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:28.329 19:42:19 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:28.329 19:42:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:28.329 19:42:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:28.329 19:42:19 -- common/autotest_common.sh@10 -- # set +x 00:06:28.329 ************************************ 00:06:28.329 START TEST env 00:06:28.329 ************************************ 00:06:28.329 19:42:19 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:28.329 * Looking for test storage... 
00:06:28.329 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:28.329 19:42:19 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:28.329 19:42:19 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:28.329 19:42:19 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:28.329 19:42:19 env -- common/autotest_common.sh@10 -- # set +x 00:06:28.329 ************************************ 00:06:28.329 START TEST env_memory 00:06:28.329 ************************************ 00:06:28.329 19:42:19 env.env_memory -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:28.329 00:06:28.329 00:06:28.329 CUnit - A unit testing framework for C - Version 2.1-3 00:06:28.329 http://cunit.sourceforge.net/ 00:06:28.329 00:06:28.329 00:06:28.329 Suite: memory 00:06:28.329 Test: alloc and free memory map ...[2024-07-24 19:42:19.650737] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:28.329 passed 00:06:28.329 Test: mem map translation ...[2024-07-24 19:42:19.679991] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:28.329 [2024-07-24 19:42:19.680013] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:28.329 [2024-07-24 19:42:19.680068] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:28.329 [2024-07-24 19:42:19.680081] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 
600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:28.329 passed 00:06:28.329 Test: mem map registration ...[2024-07-24 19:42:19.737773] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:28.329 [2024-07-24 19:42:19.737795] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:28.329 passed 00:06:28.329 Test: mem map adjacent registrations ...passed 00:06:28.329 00:06:28.329 Run Summary: Type Total Ran Passed Failed Inactive 00:06:28.329 suites 1 1 n/a 0 0 00:06:28.329 tests 4 4 4 0 0 00:06:28.329 asserts 152 152 152 0 n/a 00:06:28.329 00:06:28.329 Elapsed time = 0.195 seconds 00:06:28.329 00:06:28.329 real 0m0.206s 00:06:28.329 user 0m0.193s 00:06:28.329 sys 0m0.012s 00:06:28.329 19:42:19 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:28.329 19:42:19 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:28.329 ************************************ 00:06:28.329 END TEST env_memory 00:06:28.329 ************************************ 00:06:28.329 19:42:19 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:28.329 19:42:19 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:28.329 19:42:19 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:28.329 19:42:19 env -- common/autotest_common.sh@10 -- # set +x 00:06:28.329 ************************************ 00:06:28.329 START TEST env_vtophys 00:06:28.329 ************************************ 00:06:28.329 19:42:19 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:28.329 EAL: lib.eal log level changed from notice to debug 00:06:28.329 EAL: 
Detected lcore 0 as core 0 on socket 0 00:06:28.329 EAL: Detected lcore 1 as core 1 on socket 0 00:06:28.329 EAL: Detected lcore 2 as core 2 on socket 0 00:06:28.329 EAL: Detected lcore 3 as core 3 on socket 0 00:06:28.329 EAL: Detected lcore 4 as core 4 on socket 0 00:06:28.329 EAL: Detected lcore 5 as core 8 on socket 0 00:06:28.329 EAL: Detected lcore 6 as core 9 on socket 0 00:06:28.329 EAL: Detected lcore 7 as core 10 on socket 0 00:06:28.329 EAL: Detected lcore 8 as core 11 on socket 0 00:06:28.329 EAL: Detected lcore 9 as core 16 on socket 0 00:06:28.329 EAL: Detected lcore 10 as core 17 on socket 0 00:06:28.329 EAL: Detected lcore 11 as core 18 on socket 0 00:06:28.329 EAL: Detected lcore 12 as core 19 on socket 0 00:06:28.329 EAL: Detected lcore 13 as core 20 on socket 0 00:06:28.329 EAL: Detected lcore 14 as core 24 on socket 0 00:06:28.329 EAL: Detected lcore 15 as core 25 on socket 0 00:06:28.329 EAL: Detected lcore 16 as core 26 on socket 0 00:06:28.329 EAL: Detected lcore 17 as core 27 on socket 0 00:06:28.329 EAL: Detected lcore 18 as core 0 on socket 1 00:06:28.329 EAL: Detected lcore 19 as core 1 on socket 1 00:06:28.329 EAL: Detected lcore 20 as core 2 on socket 1 00:06:28.329 EAL: Detected lcore 21 as core 3 on socket 1 00:06:28.329 EAL: Detected lcore 22 as core 4 on socket 1 00:06:28.329 EAL: Detected lcore 23 as core 8 on socket 1 00:06:28.329 EAL: Detected lcore 24 as core 9 on socket 1 00:06:28.329 EAL: Detected lcore 25 as core 10 on socket 1 00:06:28.329 EAL: Detected lcore 26 as core 11 on socket 1 00:06:28.329 EAL: Detected lcore 27 as core 16 on socket 1 00:06:28.329 EAL: Detected lcore 28 as core 17 on socket 1 00:06:28.329 EAL: Detected lcore 29 as core 18 on socket 1 00:06:28.329 EAL: Detected lcore 30 as core 19 on socket 1 00:06:28.329 EAL: Detected lcore 31 as core 20 on socket 1 00:06:28.329 EAL: Detected lcore 32 as core 24 on socket 1 00:06:28.329 EAL: Detected lcore 33 as core 25 on socket 1 00:06:28.329 EAL: Detected lcore 34 
as core 26 on socket 1 00:06:28.329 EAL: Detected lcore 35 as core 27 on socket 1 00:06:28.329 EAL: Detected lcore 36 as core 0 on socket 0 00:06:28.329 EAL: Detected lcore 37 as core 1 on socket 0 00:06:28.329 EAL: Detected lcore 38 as core 2 on socket 0 00:06:28.329 EAL: Detected lcore 39 as core 3 on socket 0 00:06:28.329 EAL: Detected lcore 40 as core 4 on socket 0 00:06:28.329 EAL: Detected lcore 41 as core 8 on socket 0 00:06:28.329 EAL: Detected lcore 42 as core 9 on socket 0 00:06:28.329 EAL: Detected lcore 43 as core 10 on socket 0 00:06:28.329 EAL: Detected lcore 44 as core 11 on socket 0 00:06:28.329 EAL: Detected lcore 45 as core 16 on socket 0 00:06:28.329 EAL: Detected lcore 46 as core 17 on socket 0 00:06:28.329 EAL: Detected lcore 47 as core 18 on socket 0 00:06:28.329 EAL: Detected lcore 48 as core 19 on socket 0 00:06:28.329 EAL: Detected lcore 49 as core 20 on socket 0 00:06:28.329 EAL: Detected lcore 50 as core 24 on socket 0 00:06:28.329 EAL: Detected lcore 51 as core 25 on socket 0 00:06:28.329 EAL: Detected lcore 52 as core 26 on socket 0 00:06:28.329 EAL: Detected lcore 53 as core 27 on socket 0 00:06:28.329 EAL: Detected lcore 54 as core 0 on socket 1 00:06:28.329 EAL: Detected lcore 55 as core 1 on socket 1 00:06:28.329 EAL: Detected lcore 56 as core 2 on socket 1 00:06:28.329 EAL: Detected lcore 57 as core 3 on socket 1 00:06:28.329 EAL: Detected lcore 58 as core 4 on socket 1 00:06:28.329 EAL: Detected lcore 59 as core 8 on socket 1 00:06:28.329 EAL: Detected lcore 60 as core 9 on socket 1 00:06:28.329 EAL: Detected lcore 61 as core 10 on socket 1 00:06:28.329 EAL: Detected lcore 62 as core 11 on socket 1 00:06:28.329 EAL: Detected lcore 63 as core 16 on socket 1 00:06:28.329 EAL: Detected lcore 64 as core 17 on socket 1 00:06:28.329 EAL: Detected lcore 65 as core 18 on socket 1 00:06:28.329 EAL: Detected lcore 66 as core 19 on socket 1 00:06:28.329 EAL: Detected lcore 67 as core 20 on socket 1 00:06:28.329 EAL: Detected lcore 68 as core 
24 on socket 1 00:06:28.329 EAL: Detected lcore 69 as core 25 on socket 1 00:06:28.329 EAL: Detected lcore 70 as core 26 on socket 1 00:06:28.329 EAL: Detected lcore 71 as core 27 on socket 1 00:06:28.590 EAL: Maximum logical cores by configuration: 128 00:06:28.590 EAL: Detected CPU lcores: 72 00:06:28.590 EAL: Detected NUMA nodes: 2 00:06:28.590 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:28.590 EAL: Detected shared linkage of DPDK 00:06:28.590 EAL: No shared files mode enabled, IPC will be disabled 00:06:28.590 EAL: No shared files mode enabled, IPC is disabled 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 
0000:3f:01.2 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA 
as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:28.590 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:28.590 EAL: Bus pci wants IOVA as 'PA' 00:06:28.590 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:28.590 EAL: Bus vdev wants IOVA as 'DC' 00:06:28.590 EAL: Selected IOVA mode 'PA' 00:06:28.590 EAL: Probing VFIO support... 00:06:28.590 EAL: IOMMU type 1 (Type 1) is supported 00:06:28.590 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:28.590 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:28.590 EAL: VFIO support initialized 00:06:28.590 EAL: Ask a virtual area of 0x2e000 bytes 00:06:28.590 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:28.590 EAL: Setting up physically contiguous memory... 00:06:28.590 EAL: Setting maximum number of open files to 524288 00:06:28.590 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:28.590 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:28.590 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:28.590 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.590 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:28.590 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.590 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.590 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:28.590 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:28.590 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.590 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:28.590 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.590 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.590 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:28.590 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:28.590 EAL: 
Ask a virtual area of 0x61000 bytes 00:06:28.590 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:28.590 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.590 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.590 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:28.590 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:28.590 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.590 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:28.591 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:28.591 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.591 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:28.591 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:28.591 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:28.591 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.591 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:28.591 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.591 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.591 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:28.591 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:28.591 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.591 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:28.591 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.591 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.591 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:28.591 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:28.591 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.591 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:28.591 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.591 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.591 EAL: Virtual area 
found at 0x201800e00000 (size = 0x400000000) 00:06:28.591 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:28.591 EAL: Ask a virtual area of 0x61000 bytes 00:06:28.591 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:28.591 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:28.591 EAL: Ask a virtual area of 0x400000000 bytes 00:06:28.591 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:28.591 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:28.591 EAL: Hugepages will be freed exactly as allocated. 00:06:28.591 EAL: No shared files mode enabled, IPC is disabled 00:06:28.591 EAL: No shared files mode enabled, IPC is disabled 00:06:28.591 EAL: TSC frequency is ~2300000 KHz 00:06:28.591 EAL: Main lcore 0 is ready (tid=7f34a51b3b00;cpuset=[0]) 00:06:28.591 EAL: Trying to obtain current memory policy. 00:06:28.591 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.591 EAL: Restoring previous memory policy: 0 00:06:28.591 EAL: request: mp_malloc_sync 00:06:28.591 EAL: No shared files mode enabled, IPC is disabled 00:06:28.591 EAL: Heap on socket 0 was expanded by 2MB 00:06:28.591 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001000000 00:06:28.591 EAL: PCI memory mapped at 0x202001001000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001002000 00:06:28.591 EAL: PCI memory mapped at 0x202001003000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001004000 00:06:28.591 EAL: PCI memory mapped at 0x202001005000 00:06:28.591 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001006000 00:06:28.591 EAL: PCI memory mapped at 0x202001007000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001008000 00:06:28.591 EAL: PCI memory mapped at 0x202001009000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200100a000 00:06:28.591 EAL: PCI memory mapped at 0x20200100b000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200100c000 00:06:28.591 EAL: PCI memory mapped at 0x20200100d000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200100e000 00:06:28.591 EAL: PCI memory mapped at 0x20200100f000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001010000 00:06:28.591 EAL: PCI memory mapped at 0x202001011000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 
0x202001012000 00:06:28.591 EAL: PCI memory mapped at 0x202001013000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001014000 00:06:28.591 EAL: PCI memory mapped at 0x202001015000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001016000 00:06:28.591 EAL: PCI memory mapped at 0x202001017000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001018000 00:06:28.591 EAL: PCI memory mapped at 0x202001019000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200101a000 00:06:28.591 EAL: PCI memory mapped at 0x20200101b000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200101c000 00:06:28.591 EAL: PCI memory mapped at 0x20200101d000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:28.591 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200101e000 00:06:28.591 EAL: PCI memory mapped at 0x20200101f000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 
00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001020000 00:06:28.591 EAL: PCI memory mapped at 0x202001021000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001022000 00:06:28.591 EAL: PCI memory mapped at 0x202001023000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001024000 00:06:28.591 EAL: PCI memory mapped at 0x202001025000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001026000 00:06:28.591 EAL: PCI memory mapped at 0x202001027000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001028000 00:06:28.591 EAL: PCI memory mapped at 0x202001029000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200102a000 00:06:28.591 EAL: PCI memory mapped at 0x20200102b000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200102c000 00:06:28.591 EAL: PCI memory mapped at 0x20200102d000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:01.6 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x20200102e000 00:06:28.591 EAL: PCI memory mapped at 0x20200102f000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001030000 00:06:28.591 EAL: PCI memory mapped at 0x202001031000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001032000 00:06:28.591 EAL: PCI memory mapped at 0x202001033000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001034000 00:06:28.591 EAL: PCI memory mapped at 0x202001035000 00:06:28.591 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:28.591 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:06:28.591 EAL: probe driver: 8086:37c9 qat 00:06:28.591 EAL: PCI memory mapped at 0x202001036000 00:06:28.591 EAL: PCI memory mapped at 0x202001037000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:28.592 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001038000 00:06:28.592 EAL: PCI memory mapped at 0x202001039000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:28.592 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x20200103a000 00:06:28.592 EAL: PCI memory 
mapped at 0x20200103b000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:28.592 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x20200103c000 00:06:28.592 EAL: PCI memory mapped at 0x20200103d000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:28.592 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x20200103e000 00:06:28.592 EAL: PCI memory mapped at 0x20200103f000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:28.592 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001040000 00:06:28.592 EAL: PCI memory mapped at 0x202001041000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:28.592 EAL: Trying to obtain current memory policy. 
00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:28.592 EAL: Restoring previous memory policy: 4 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 1 was expanded by 2MB 00:06:28.592 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001042000 00:06:28.592 EAL: PCI memory mapped at 0x202001043000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001044000 00:06:28.592 EAL: PCI memory mapped at 0x202001045000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001046000 00:06:28.592 EAL: PCI memory mapped at 0x202001047000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001048000 00:06:28.592 EAL: PCI memory mapped at 0x202001049000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x20200104a000 00:06:28.592 EAL: PCI memory mapped at 0x20200104b000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x20200104c000 00:06:28.592 EAL: PCI memory mapped at 0x20200104d000 00:06:28.592 EAL: Probe 
PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x20200104e000 00:06:28.592 EAL: PCI memory mapped at 0x20200104f000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001050000 00:06:28.592 EAL: PCI memory mapped at 0x202001051000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001052000 00:06:28.592 EAL: PCI memory mapped at 0x202001053000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001054000 00:06:28.592 EAL: PCI memory mapped at 0x202001055000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001056000 00:06:28.592 EAL: PCI memory mapped at 0x202001057000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 0x202001058000 00:06:28.592 EAL: PCI memory mapped at 0x202001059000 00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:28.592 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:06:28.592 EAL: probe driver: 8086:37c9 qat 00:06:28.592 EAL: PCI memory mapped at 
0x20200105a000
00:06:28.592 EAL: PCI memory mapped at 0x20200105b000
00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:28.592 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:06:28.592 EAL: probe driver: 8086:37c9 qat
00:06:28.592 EAL: PCI memory mapped at 0x20200105c000
00:06:28.592 EAL: PCI memory mapped at 0x20200105d000
00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:28.592 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:06:28.592 EAL: probe driver: 8086:37c9 qat
00:06:28.592 EAL: PCI memory mapped at 0x20200105e000
00:06:28.592 EAL: PCI memory mapped at 0x20200105f000
00:06:28.592 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:28.592 EAL: No shared files mode enabled, IPC is disabled
00:06:28.592 EAL: No shared files mode enabled, IPC is disabled
00:06:28.592 EAL: No PCI address specified using 'addr=' in: bus=pci
00:06:28.592 EAL: Mem event callback 'spdk:(nil)' registered
00:06:28.592
00:06:28.592
00:06:28.592 CUnit - A unit testing framework for C - Version 2.1-3
00:06:28.592 http://cunit.sourceforge.net/
00:06:28.592
00:06:28.592
00:06:28.592 Suite: components_suite
00:06:28.592 Test: vtophys_malloc_test ...passed
00:06:28.592 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:28.592 EAL: Restoring previous memory policy: 4
00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)'
00:06:28.592 EAL: request: mp_malloc_sync
00:06:28.592 EAL: No shared files mode enabled, IPC is disabled
00:06:28.592 EAL: Heap on socket 0 was expanded by 4MB
00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)'
00:06:28.592 EAL: request: mp_malloc_sync
00:06:28.592 EAL: No shared files mode enabled, IPC is disabled
00:06:28.592 EAL: Heap on socket 0 was shrunk by 4MB
00:06:28.592 EAL: Trying to obtain current memory policy.
00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.592 EAL: Restoring previous memory policy: 4 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was expanded by 6MB 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was shrunk by 6MB 00:06:28.592 EAL: Trying to obtain current memory policy. 00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.592 EAL: Restoring previous memory policy: 4 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was expanded by 10MB 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was shrunk by 10MB 00:06:28.592 EAL: Trying to obtain current memory policy. 00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.592 EAL: Restoring previous memory policy: 4 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was expanded by 18MB 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was shrunk by 18MB 00:06:28.592 EAL: Trying to obtain current memory policy. 
00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.592 EAL: Restoring previous memory policy: 4 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was expanded by 34MB 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was shrunk by 34MB 00:06:28.592 EAL: Trying to obtain current memory policy. 00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.592 EAL: Restoring previous memory policy: 4 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was expanded by 66MB 00:06:28.592 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.592 EAL: request: mp_malloc_sync 00:06:28.592 EAL: No shared files mode enabled, IPC is disabled 00:06:28.592 EAL: Heap on socket 0 was shrunk by 66MB 00:06:28.592 EAL: Trying to obtain current memory policy. 00:06:28.592 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.592 EAL: Restoring previous memory policy: 4 00:06:28.593 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.593 EAL: request: mp_malloc_sync 00:06:28.593 EAL: No shared files mode enabled, IPC is disabled 00:06:28.593 EAL: Heap on socket 0 was expanded by 130MB 00:06:28.593 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.593 EAL: request: mp_malloc_sync 00:06:28.593 EAL: No shared files mode enabled, IPC is disabled 00:06:28.593 EAL: Heap on socket 0 was shrunk by 130MB 00:06:28.593 EAL: Trying to obtain current memory policy. 
00:06:28.593 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:28.852 EAL: Restoring previous memory policy: 4 00:06:28.852 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.852 EAL: request: mp_malloc_sync 00:06:28.852 EAL: No shared files mode enabled, IPC is disabled 00:06:28.852 EAL: Heap on socket 0 was expanded by 258MB 00:06:28.852 EAL: Calling mem event callback 'spdk:(nil)' 00:06:28.852 EAL: request: mp_malloc_sync 00:06:28.852 EAL: No shared files mode enabled, IPC is disabled 00:06:28.852 EAL: Heap on socket 0 was shrunk by 258MB 00:06:28.852 EAL: Trying to obtain current memory policy. 00:06:28.852 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.112 EAL: Restoring previous memory policy: 4 00:06:29.112 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.112 EAL: request: mp_malloc_sync 00:06:29.112 EAL: No shared files mode enabled, IPC is disabled 00:06:29.112 EAL: Heap on socket 0 was expanded by 514MB 00:06:29.112 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.112 EAL: request: mp_malloc_sync 00:06:29.112 EAL: No shared files mode enabled, IPC is disabled 00:06:29.112 EAL: Heap on socket 0 was shrunk by 514MB 00:06:29.112 EAL: Trying to obtain current memory policy. 
00:06:29.112 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:29.372 EAL: Restoring previous memory policy: 4
00:06:29.372 EAL: Calling mem event callback 'spdk:(nil)'
00:06:29.372 EAL: request: mp_malloc_sync
00:06:29.372 EAL: No shared files mode enabled, IPC is disabled
00:06:29.372 EAL: Heap on socket 0 was expanded by 1026MB
00:06:29.631 EAL: Calling mem event callback 'spdk:(nil)'
00:06:29.891 EAL: request: mp_malloc_sync
00:06:29.892 EAL: No shared files mode enabled, IPC is disabled
00:06:29.892 EAL: Heap on socket 0 was shrunk by 1026MB
00:06:29.892 passed
00:06:29.892
00:06:29.892 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:29.892               suites      1      1    n/a      0        0
00:06:29.892                tests      2      2      2      0        0
00:06:29.892              asserts   6240   6240   6240      0      n/a
00:06:29.892
00:06:29.892 Elapsed time = 1.177 seconds
00:06:29.892 EAL: No shared files mode enabled, IPC is disabled
00:06:29.892 EAL: No shared files mode enabled, IPC is disabled
00:06:29.892 EAL: No shared files mode enabled, IPC is disabled
00:06:29.892
00:06:29.892 real 0m1.375s
00:06:29.892 user 0m0.767s
00:06:29.892 sys 0m0.575s
00:06:29.892 19:42:21 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:29.892 19:42:21 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:06:29.892 ************************************
00:06:29.892 END TEST env_vtophys
00:06:29.892 ************************************
00:06:29.892 19:42:21 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:29.892 19:42:21 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:29.892 19:42:21 env -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:29.892 19:42:21 env -- common/autotest_common.sh@10 -- # set +x
00:06:29.892 ************************************
00:06:29.892 START TEST env_pci
00:06:29.892 ************************************
00:06:29.892 19:42:21 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:29.892
00:06:29.892
00:06:29.892 CUnit - A unit testing framework for C - Version 2.1-3
00:06:29.892 http://cunit.sourceforge.net/
00:06:29.892
00:06:29.892
00:06:29.892 Suite: pci
00:06:29.892 Test: pci_hook ...[2024-07-24 19:42:21.381414] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1330203 has claimed it
00:06:29.892 EAL: Cannot find device (10000:00:01.0)
00:06:29.892 EAL: Failed to attach device on primary process
00:06:29.892 passed
00:06:29.892
00:06:29.892 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:29.892               suites      1      1    n/a      0        0
00:06:29.892                tests      1      1      1      0        0
00:06:29.892              asserts     25     25     25      0      n/a
00:06:29.892
00:06:29.892 Elapsed time = 0.042 seconds
00:06:29.892
00:06:29.892 real 0m0.070s
00:06:29.892 user 0m0.018s
00:06:29.892 sys 0m0.052s
00:06:29.892 19:42:21 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:29.892 19:42:21 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:06:29.892 ************************************
00:06:29.892 END TEST env_pci
00:06:29.892 ************************************
00:06:29.892 19:42:21 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:06:29.892 19:42:21 env -- env/env.sh@15 -- # uname
00:06:29.892 19:42:21 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:06:29.892 19:42:21 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:06:29.892 19:42:21 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:29.892 19:42:21 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:06:29.892 19:42:21 env -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:29.892 19:42:21 env -- common/autotest_common.sh@10 -- # set +x
00:06:30.153 ************************************ 00:06:30.153 START TEST env_dpdk_post_init 00:06:30.153 ************************************ 00:06:30.153 19:42:21 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:30.153 EAL: Detected CPU lcores: 72 00:06:30.153 EAL: Detected NUMA nodes: 2 00:06:30.153 EAL: Detected shared linkage of DPDK 00:06:30.153 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:30.154 EAL: Selected IOVA mode 'PA' 00:06:30.154 EAL: VFIO support initialized 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: 
Creating cryptodev 0000:3d:01.7_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 
00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 
00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 
0000:3f:01.5_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:30.154 CRYPTODEV: Initialisation 
parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.154 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:30.154 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.154 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:30.155 CRYPTODEV: 
Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, 
max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 
0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.155 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:30.155 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.155 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:30.155 EAL: Using IOMMU type 1 (Type 1) 00:06:30.155 EAL: Ignore mapping IO port bar(1) 00:06:30.155 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 
00:06:30.415 EAL: Ignore mapping IO port bar(1)
00:06:30.415 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0)
00:06:30.415 EAL: Ignore mapping IO port bar(1)
00:06:30.415 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0)
00:06:30.415 EAL: Ignore mapping IO port bar(1)
00:06:30.415 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0)
00:06:30.415 EAL: Ignore mapping IO port bar(1)
00:06:30.415 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0)
00:06:30.415 EAL: Ignore mapping IO port bar(1)
00:06:30.415 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0)
00:06:30.415 EAL: Ignore mapping IO port bar(1)
00:06:30.415 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0)
00:06:30.415 EAL: Ignore mapping IO port bar(1)
00:06:30.415 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0)
00:06:30.675 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Ignore mapping IO port bar(5)
00:06:30.675 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1)
00:06:30.675 EAL: Ignore mapping IO port bar(1)
00:06:30.675 EAL: Ignore mapping IO port bar(5)
00:06:30.675 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1)
00:06:34.028 EAL: Releasing PCI mapped resource for 0000:5e:00.0
00:06:34.028 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000
00:06:34.028 Starting DPDK initialization...
00:06:34.028 Starting SPDK post initialization...
00:06:34.028 SPDK NVMe probe
00:06:34.028 Attaching to 0000:5e:00.0
00:06:34.028 Attached to 0000:5e:00.0
00:06:34.028 Cleaning up...
00:06:34.028
00:06:34.028 real 0m3.508s
00:06:34.028 user 0m2.414s
00:06:34.028 sys 0m0.654s
00:06:34.028 19:42:25 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:34.028 19:42:25 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x
00:06:34.028 ************************************
00:06:34.028 END TEST env_dpdk_post_init
00:06:34.028 ************************************
00:06:34.028 19:42:25 env -- env/env.sh@26 -- # uname
00:06:34.028 19:42:25 env -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:06:34.028 19:42:25 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:06:34.028 19:42:25 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:34.028 19:42:25 env -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:34.028 19:42:25 env -- common/autotest_common.sh@10 -- # set +x
00:06:34.028 ************************************
00:06:34.028 START TEST env_mem_callbacks
00:06:34.028 ************************************
00:06:34.028
19:42:25 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:34.028 EAL: Detected CPU lcores: 72 00:06:34.028 EAL: Detected NUMA nodes: 2 00:06:34.028 EAL: Detected shared linkage of DPDK 00:06:34.028 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:34.028 EAL: Selected IOVA mode 'PA' 00:06:34.028 EAL: VFIO support initialized 00:06:34.028 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.028 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.028 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.028 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.028 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.028 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.028 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:34.028 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3d:02.0 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating 
cryptodev 0000:3d:02.4_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:34.029 
CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:34.029 
CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 
00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 
0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:34.029 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.029 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:34.029 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:34.030 CRYPTODEV: Initialisation 
parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 
0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, 
max queue pairs: 0
00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym
00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym
00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym
00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym
00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym
00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:34.030 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym
00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:34.030 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym
00:06:34.030 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:34.030 TELEMETRY: No legacy callbacks, legacy socket not created
00:06:34.030
00:06:34.030
00:06:34.030 CUnit - A unit testing framework for C - Version 2.1-3
00:06:34.030 http://cunit.sourceforge.net/
00:06:34.030
00:06:34.030
00:06:34.030 Suite: memory
00:06:34.030 Test: test ...
00:06:34.030 register 0x200000200000 2097152
00:06:34.030 register 0x201000a00000 2097152
00:06:34.030 malloc 3145728
00:06:34.030 register 0x200000400000 4194304
00:06:34.030 buf 0x200000500000 len 3145728 PASSED
00:06:34.030 malloc 64
00:06:34.030 buf 0x2000004fff40 len 64 PASSED
00:06:34.030 malloc 4194304
00:06:34.030 register 0x200000800000 6291456
00:06:34.030 buf 0x200000a00000 len 4194304 PASSED
00:06:34.030 free 0x200000500000 3145728
00:06:34.030 free 0x2000004fff40 64
00:06:34.030 unregister 0x200000400000 4194304 PASSED
00:06:34.030 free 0x200000a00000 4194304
00:06:34.030 unregister 0x200000800000 6291456 PASSED
00:06:34.030 malloc 8388608
00:06:34.030 register 0x200000400000 10485760
00:06:34.030 buf 0x200000600000 len 8388608 PASSED
00:06:34.030 free 0x200000600000 8388608
00:06:34.030 unregister 0x200000400000 10485760 PASSED
00:06:34.030 passed
00:06:34.030
00:06:34.030 Run Summary: Type Total Ran Passed Failed Inactive
00:06:34.030 suites 1 1 n/a 0 0
00:06:34.030 tests 1 1 1 0 0
00:06:34.030 asserts 16 16 16 0 n/a
00:06:34.030
00:06:34.030 Elapsed time = 0.008 seconds
00:06:34.030
00:06:34.030 real 0m0.098s
00:06:34.030 user 0m0.030s
00:06:34.030 sys 0m0.068s
00:06:34.030 19:42:25 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:34.030 19:42:25 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x
00:06:34.030 ************************************
00:06:34.030 END TEST env_mem_callbacks
00:06:34.030 ************************************
00:06:34.030
00:06:34.030 real 0m5.796s
00:06:34.030 user 0m3.631s
00:06:34.030 sys 0m1.732s
00:06:34.030 19:42:25 env -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:34.030 19:42:25 env -- common/autotest_common.sh@10 -- # set +x
00:06:34.030 ************************************
00:06:34.030 END TEST env
00:06:34.030 ************************************
00:06:34.030 19:42:25 -- spdk/autotest.sh@169 -- # run_test rpc
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:06:34.030 19:42:25 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:34.030 19:42:25 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:34.030 19:42:25 -- common/autotest_common.sh@10 -- # set +x
00:06:34.030 ************************************
00:06:34.030 START TEST rpc
00:06:34.030 ************************************
00:06:34.030 19:42:25 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:06:34.030 * Looking for test storage...
00:06:34.030 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:06:34.030 19:42:25 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1330856
00:06:34.030 19:42:25 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:06:34.030 19:42:25 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:34.030 19:42:25 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1330856
00:06:34.030 19:42:25 rpc -- common/autotest_common.sh@831 -- # '[' -z 1330856 ']'
00:06:34.030 19:42:25 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:34.030 19:42:25 rpc -- common/autotest_common.sh@836 -- # local max_retries=100
00:06:34.030 19:42:25 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:34.030 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:34.030 19:42:25 rpc -- common/autotest_common.sh@840 -- # xtrace_disable
00:06:34.030 19:42:25 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:34.030 [2024-07-24 19:42:25.529283] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:06:34.030 [2024-07-24 19:42:25.529353] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1330856 ]
00:06:34.290 [2024-07-24 19:42:25.657218] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:34.290 [2024-07-24 19:42:25.760989] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified.
00:06:34.290 [2024-07-24 19:42:25.761040] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1330856' to capture a snapshot of events at runtime.
00:06:34.290 [2024-07-24 19:42:25.761055] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only
00:06:34.290 [2024-07-24 19:42:25.761067] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running.
00:06:34.290 [2024-07-24 19:42:25.761078] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1330856 for offline analysis/debug.
00:06:34.290 [2024-07-24 19:42:25.761108] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:34.857 19:42:26 rpc -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:06:34.857 19:42:26 rpc -- common/autotest_common.sh@864 -- # return 0
00:06:34.857 19:42:26 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:06:34.857 19:42:26 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:06:34.857 19:42:26 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd
00:06:34.857 19:42:26 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity
00:06:34.857 19:42:26 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:34.857 19:42:26 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:34.857 19:42:26 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:35.117 ************************************
00:06:35.117 START TEST rpc_integrity
00:06:35.117 ************************************
00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity
00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@12 -- #
bdevs='[]' 00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.117 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.117 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:35.117 { 00:06:35.117 "name": "Malloc0", 00:06:35.118 "aliases": [ 00:06:35.118 "df1d48be-8856-4481-896d-96171bfd0e56" 00:06:35.118 ], 00:06:35.118 "product_name": "Malloc disk", 00:06:35.118 "block_size": 512, 00:06:35.118 "num_blocks": 16384, 00:06:35.118 "uuid": "df1d48be-8856-4481-896d-96171bfd0e56", 00:06:35.118 "assigned_rate_limits": { 00:06:35.118 "rw_ios_per_sec": 0, 00:06:35.118 "rw_mbytes_per_sec": 0, 00:06:35.118 "r_mbytes_per_sec": 0, 00:06:35.118 "w_mbytes_per_sec": 0 00:06:35.118 }, 00:06:35.118 "claimed": false, 00:06:35.118 "zoned": false, 00:06:35.118 "supported_io_types": { 00:06:35.118 "read": true, 00:06:35.118 "write": true, 00:06:35.118 "unmap": true, 00:06:35.118 "flush": true, 00:06:35.118 "reset": true, 00:06:35.118 "nvme_admin": false, 00:06:35.118 "nvme_io": false, 00:06:35.118 "nvme_io_md": false, 00:06:35.118 "write_zeroes": true, 00:06:35.118 "zcopy": true, 00:06:35.118 "get_zone_info": false, 00:06:35.118 "zone_management": 
false, 00:06:35.118 "zone_append": false, 00:06:35.118 "compare": false, 00:06:35.118 "compare_and_write": false, 00:06:35.118 "abort": true, 00:06:35.118 "seek_hole": false, 00:06:35.118 "seek_data": false, 00:06:35.118 "copy": true, 00:06:35.118 "nvme_iov_md": false 00:06:35.118 }, 00:06:35.118 "memory_domains": [ 00:06:35.118 { 00:06:35.118 "dma_device_id": "system", 00:06:35.118 "dma_device_type": 1 00:06:35.118 }, 00:06:35.118 { 00:06:35.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.118 "dma_device_type": 2 00:06:35.118 } 00:06:35.118 ], 00:06:35.118 "driver_specific": {} 00:06:35.118 } 00:06:35.118 ]' 00:06:35.118 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:35.118 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:35.118 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:35.118 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.118 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.118 [2024-07-24 19:42:26.635837] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:35.118 [2024-07-24 19:42:26.635877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:35.118 [2024-07-24 19:42:26.635898] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274df30 00:06:35.118 [2024-07-24 19:42:26.635910] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:35.118 [2024-07-24 19:42:26.637502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:35.118 [2024-07-24 19:42:26.637530] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:35.118 Passthru0 00:06:35.118 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.118 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 
00:06:35.118 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.118 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.118 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.118 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:35.118 { 00:06:35.118 "name": "Malloc0", 00:06:35.118 "aliases": [ 00:06:35.118 "df1d48be-8856-4481-896d-96171bfd0e56" 00:06:35.118 ], 00:06:35.118 "product_name": "Malloc disk", 00:06:35.118 "block_size": 512, 00:06:35.118 "num_blocks": 16384, 00:06:35.118 "uuid": "df1d48be-8856-4481-896d-96171bfd0e56", 00:06:35.118 "assigned_rate_limits": { 00:06:35.118 "rw_ios_per_sec": 0, 00:06:35.118 "rw_mbytes_per_sec": 0, 00:06:35.118 "r_mbytes_per_sec": 0, 00:06:35.118 "w_mbytes_per_sec": 0 00:06:35.118 }, 00:06:35.118 "claimed": true, 00:06:35.118 "claim_type": "exclusive_write", 00:06:35.118 "zoned": false, 00:06:35.118 "supported_io_types": { 00:06:35.118 "read": true, 00:06:35.118 "write": true, 00:06:35.118 "unmap": true, 00:06:35.118 "flush": true, 00:06:35.118 "reset": true, 00:06:35.118 "nvme_admin": false, 00:06:35.118 "nvme_io": false, 00:06:35.118 "nvme_io_md": false, 00:06:35.118 "write_zeroes": true, 00:06:35.118 "zcopy": true, 00:06:35.118 "get_zone_info": false, 00:06:35.118 "zone_management": false, 00:06:35.118 "zone_append": false, 00:06:35.118 "compare": false, 00:06:35.118 "compare_and_write": false, 00:06:35.118 "abort": true, 00:06:35.118 "seek_hole": false, 00:06:35.118 "seek_data": false, 00:06:35.118 "copy": true, 00:06:35.118 "nvme_iov_md": false 00:06:35.118 }, 00:06:35.118 "memory_domains": [ 00:06:35.118 { 00:06:35.118 "dma_device_id": "system", 00:06:35.118 "dma_device_type": 1 00:06:35.118 }, 00:06:35.118 { 00:06:35.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.118 "dma_device_type": 2 00:06:35.118 } 00:06:35.118 ], 00:06:35.118 "driver_specific": {} 00:06:35.118 }, 00:06:35.118 { 00:06:35.118 
"name": "Passthru0", 00:06:35.118 "aliases": [ 00:06:35.118 "cde5673e-8418-59ae-be39-357490f53933" 00:06:35.118 ], 00:06:35.118 "product_name": "passthru", 00:06:35.118 "block_size": 512, 00:06:35.118 "num_blocks": 16384, 00:06:35.118 "uuid": "cde5673e-8418-59ae-be39-357490f53933", 00:06:35.118 "assigned_rate_limits": { 00:06:35.118 "rw_ios_per_sec": 0, 00:06:35.118 "rw_mbytes_per_sec": 0, 00:06:35.118 "r_mbytes_per_sec": 0, 00:06:35.118 "w_mbytes_per_sec": 0 00:06:35.118 }, 00:06:35.118 "claimed": false, 00:06:35.118 "zoned": false, 00:06:35.118 "supported_io_types": { 00:06:35.118 "read": true, 00:06:35.118 "write": true, 00:06:35.118 "unmap": true, 00:06:35.118 "flush": true, 00:06:35.118 "reset": true, 00:06:35.118 "nvme_admin": false, 00:06:35.118 "nvme_io": false, 00:06:35.118 "nvme_io_md": false, 00:06:35.118 "write_zeroes": true, 00:06:35.118 "zcopy": true, 00:06:35.118 "get_zone_info": false, 00:06:35.118 "zone_management": false, 00:06:35.118 "zone_append": false, 00:06:35.118 "compare": false, 00:06:35.118 "compare_and_write": false, 00:06:35.118 "abort": true, 00:06:35.118 "seek_hole": false, 00:06:35.118 "seek_data": false, 00:06:35.118 "copy": true, 00:06:35.118 "nvme_iov_md": false 00:06:35.118 }, 00:06:35.118 "memory_domains": [ 00:06:35.118 { 00:06:35.118 "dma_device_id": "system", 00:06:35.118 "dma_device_type": 1 00:06:35.118 }, 00:06:35.118 { 00:06:35.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.118 "dma_device_type": 2 00:06:35.118 } 00:06:35.118 ], 00:06:35.118 "driver_specific": { 00:06:35.118 "passthru": { 00:06:35.118 "name": "Passthru0", 00:06:35.118 "base_bdev_name": "Malloc0" 00:06:35.118 } 00:06:35.118 } 00:06:35.118 } 00:06:35.118 ]' 00:06:35.118 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:35.378 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:35.378 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:35.378 19:42:26 rpc.rpc_integrity -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.378 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.378 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.378 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:35.378 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:35.378 19:42:26 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:35.378 00:06:35.378 real 0m0.306s 00:06:35.378 user 0m0.183s 00:06:35.378 sys 0m0.062s 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:35.378 19:42:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.378 ************************************ 00:06:35.378 END TEST rpc_integrity 00:06:35.378 ************************************ 00:06:35.378 19:42:26 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:35.378 19:42:26 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:35.378 19:42:26 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:35.378 19:42:26 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.378 ************************************ 00:06:35.378 START TEST rpc_plugins 00:06:35.378 
************************************ 00:06:35.378 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:06:35.378 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:35.378 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.378 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.378 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.378 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:35.378 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:35.378 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.378 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.378 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.378 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:35.378 { 00:06:35.378 "name": "Malloc1", 00:06:35.378 "aliases": [ 00:06:35.378 "d6ffa5d9-9fb6-4499-9ab3-7415101fdba4" 00:06:35.378 ], 00:06:35.378 "product_name": "Malloc disk", 00:06:35.378 "block_size": 4096, 00:06:35.378 "num_blocks": 256, 00:06:35.378 "uuid": "d6ffa5d9-9fb6-4499-9ab3-7415101fdba4", 00:06:35.378 "assigned_rate_limits": { 00:06:35.378 "rw_ios_per_sec": 0, 00:06:35.378 "rw_mbytes_per_sec": 0, 00:06:35.378 "r_mbytes_per_sec": 0, 00:06:35.378 "w_mbytes_per_sec": 0 00:06:35.378 }, 00:06:35.378 "claimed": false, 00:06:35.378 "zoned": false, 00:06:35.378 "supported_io_types": { 00:06:35.379 "read": true, 00:06:35.379 "write": true, 00:06:35.379 "unmap": true, 00:06:35.379 "flush": true, 00:06:35.379 "reset": true, 00:06:35.379 "nvme_admin": false, 00:06:35.379 "nvme_io": false, 00:06:35.379 "nvme_io_md": false, 00:06:35.379 "write_zeroes": true, 00:06:35.379 "zcopy": true, 00:06:35.379 "get_zone_info": false, 00:06:35.379 "zone_management": false, 00:06:35.379 "zone_append": false, 
00:06:35.379 "compare": false, 00:06:35.379 "compare_and_write": false, 00:06:35.379 "abort": true, 00:06:35.379 "seek_hole": false, 00:06:35.379 "seek_data": false, 00:06:35.379 "copy": true, 00:06:35.379 "nvme_iov_md": false 00:06:35.379 }, 00:06:35.379 "memory_domains": [ 00:06:35.379 { 00:06:35.379 "dma_device_id": "system", 00:06:35.379 "dma_device_type": 1 00:06:35.379 }, 00:06:35.379 { 00:06:35.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:35.379 "dma_device_type": 2 00:06:35.379 } 00:06:35.379 ], 00:06:35.379 "driver_specific": {} 00:06:35.379 } 00:06:35.379 ]' 00:06:35.379 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:35.379 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:35.379 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:35.379 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.379 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.638 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.638 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:35.638 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.638 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.638 19:42:26 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.638 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:35.638 19:42:26 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:35.638 19:42:27 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:35.638 00:06:35.638 real 0m0.146s 00:06:35.638 user 0m0.093s 00:06:35.638 sys 0m0.022s 00:06:35.638 19:42:27 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:35.638 19:42:27 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:35.638 ************************************ 00:06:35.638 END TEST 
rpc_plugins 00:06:35.638 ************************************ 00:06:35.638 19:42:27 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:35.638 19:42:27 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:35.638 19:42:27 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:35.638 19:42:27 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.638 ************************************ 00:06:35.638 START TEST rpc_trace_cmd_test 00:06:35.638 ************************************ 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:35.638 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1330856", 00:06:35.638 "tpoint_group_mask": "0x8", 00:06:35.638 "iscsi_conn": { 00:06:35.638 "mask": "0x2", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "scsi": { 00:06:35.638 "mask": "0x4", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "bdev": { 00:06:35.638 "mask": "0x8", 00:06:35.638 "tpoint_mask": "0xffffffffffffffff" 00:06:35.638 }, 00:06:35.638 "nvmf_rdma": { 00:06:35.638 "mask": "0x10", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "nvmf_tcp": { 00:06:35.638 "mask": "0x20", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "ftl": { 00:06:35.638 "mask": "0x40", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "blobfs": { 00:06:35.638 "mask": "0x80", 00:06:35.638 "tpoint_mask": "0x0" 
00:06:35.638 }, 00:06:35.638 "dsa": { 00:06:35.638 "mask": "0x200", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "thread": { 00:06:35.638 "mask": "0x400", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "nvme_pcie": { 00:06:35.638 "mask": "0x800", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "iaa": { 00:06:35.638 "mask": "0x1000", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "nvme_tcp": { 00:06:35.638 "mask": "0x2000", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "bdev_nvme": { 00:06:35.638 "mask": "0x4000", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 }, 00:06:35.638 "sock": { 00:06:35.638 "mask": "0x8000", 00:06:35.638 "tpoint_mask": "0x0" 00:06:35.638 } 00:06:35.638 }' 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:35.638 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:35.898 00:06:35.898 real 0m0.243s 00:06:35.898 user 0m0.197s 00:06:35.898 sys 0m0.037s 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:35.898 19:42:27 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:35.898 ************************************ 
00:06:35.898 END TEST rpc_trace_cmd_test 00:06:35.898 ************************************ 00:06:35.898 19:42:27 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:35.898 19:42:27 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:35.898 19:42:27 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:35.898 19:42:27 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:35.898 19:42:27 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:35.898 19:42:27 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.898 ************************************ 00:06:35.898 START TEST rpc_daemon_integrity 00:06:35.898 ************************************ 00:06:35.898 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:35.898 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:35.898 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.898 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.898 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.898 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:35.898 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:36.158 19:42:27 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:36.158 { 00:06:36.158 "name": "Malloc2", 00:06:36.158 "aliases": [ 00:06:36.158 "7591d1dc-0c45-4784-84d1-c8a6d883a73a" 00:06:36.158 ], 00:06:36.158 "product_name": "Malloc disk", 00:06:36.158 "block_size": 512, 00:06:36.158 "num_blocks": 16384, 00:06:36.158 "uuid": "7591d1dc-0c45-4784-84d1-c8a6d883a73a", 00:06:36.158 "assigned_rate_limits": { 00:06:36.158 "rw_ios_per_sec": 0, 00:06:36.158 "rw_mbytes_per_sec": 0, 00:06:36.158 "r_mbytes_per_sec": 0, 00:06:36.158 "w_mbytes_per_sec": 0 00:06:36.158 }, 00:06:36.158 "claimed": false, 00:06:36.158 "zoned": false, 00:06:36.158 "supported_io_types": { 00:06:36.158 "read": true, 00:06:36.158 "write": true, 00:06:36.158 "unmap": true, 00:06:36.158 "flush": true, 00:06:36.158 "reset": true, 00:06:36.158 "nvme_admin": false, 00:06:36.158 "nvme_io": false, 00:06:36.158 "nvme_io_md": false, 00:06:36.158 "write_zeroes": true, 00:06:36.158 "zcopy": true, 00:06:36.158 "get_zone_info": false, 00:06:36.158 "zone_management": false, 00:06:36.158 "zone_append": false, 00:06:36.158 "compare": false, 00:06:36.158 "compare_and_write": false, 00:06:36.158 "abort": true, 00:06:36.158 "seek_hole": false, 00:06:36.158 "seek_data": false, 00:06:36.158 "copy": true, 00:06:36.158 "nvme_iov_md": false 00:06:36.158 }, 00:06:36.158 "memory_domains": [ 00:06:36.158 { 00:06:36.158 "dma_device_id": "system", 00:06:36.158 "dma_device_type": 1 00:06:36.158 }, 00:06:36.158 { 00:06:36.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.158 "dma_device_type": 2 00:06:36.158 } 00:06:36.158 ], 00:06:36.158 "driver_specific": {} 00:06:36.158 } 00:06:36.158 ]' 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@17 -- # jq length 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 [2024-07-24 19:42:27.574491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:36.158 [2024-07-24 19:42:27.574527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:36.158 [2024-07-24 19:42:27.574546] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28f7460 00:06:36.158 [2024-07-24 19:42:27.574558] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:36.158 [2024-07-24 19:42:27.575922] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:36.158 [2024-07-24 19:42:27.575950] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:36.158 Passthru0 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:36.158 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:36.158 { 00:06:36.159 "name": "Malloc2", 00:06:36.159 "aliases": [ 00:06:36.159 "7591d1dc-0c45-4784-84d1-c8a6d883a73a" 00:06:36.159 ], 00:06:36.159 "product_name": "Malloc disk", 00:06:36.159 "block_size": 512, 00:06:36.159 "num_blocks": 16384, 00:06:36.159 
"uuid": "7591d1dc-0c45-4784-84d1-c8a6d883a73a", 00:06:36.159 "assigned_rate_limits": { 00:06:36.159 "rw_ios_per_sec": 0, 00:06:36.159 "rw_mbytes_per_sec": 0, 00:06:36.159 "r_mbytes_per_sec": 0, 00:06:36.159 "w_mbytes_per_sec": 0 00:06:36.159 }, 00:06:36.159 "claimed": true, 00:06:36.159 "claim_type": "exclusive_write", 00:06:36.159 "zoned": false, 00:06:36.159 "supported_io_types": { 00:06:36.159 "read": true, 00:06:36.159 "write": true, 00:06:36.159 "unmap": true, 00:06:36.159 "flush": true, 00:06:36.159 "reset": true, 00:06:36.159 "nvme_admin": false, 00:06:36.159 "nvme_io": false, 00:06:36.159 "nvme_io_md": false, 00:06:36.159 "write_zeroes": true, 00:06:36.159 "zcopy": true, 00:06:36.159 "get_zone_info": false, 00:06:36.159 "zone_management": false, 00:06:36.159 "zone_append": false, 00:06:36.159 "compare": false, 00:06:36.159 "compare_and_write": false, 00:06:36.159 "abort": true, 00:06:36.159 "seek_hole": false, 00:06:36.159 "seek_data": false, 00:06:36.159 "copy": true, 00:06:36.159 "nvme_iov_md": false 00:06:36.159 }, 00:06:36.159 "memory_domains": [ 00:06:36.159 { 00:06:36.159 "dma_device_id": "system", 00:06:36.159 "dma_device_type": 1 00:06:36.159 }, 00:06:36.159 { 00:06:36.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.159 "dma_device_type": 2 00:06:36.159 } 00:06:36.159 ], 00:06:36.159 "driver_specific": {} 00:06:36.159 }, 00:06:36.159 { 00:06:36.159 "name": "Passthru0", 00:06:36.159 "aliases": [ 00:06:36.159 "63bf4b06-822d-5a83-9610-6583dee6159f" 00:06:36.159 ], 00:06:36.159 "product_name": "passthru", 00:06:36.159 "block_size": 512, 00:06:36.159 "num_blocks": 16384, 00:06:36.159 "uuid": "63bf4b06-822d-5a83-9610-6583dee6159f", 00:06:36.159 "assigned_rate_limits": { 00:06:36.159 "rw_ios_per_sec": 0, 00:06:36.159 "rw_mbytes_per_sec": 0, 00:06:36.159 "r_mbytes_per_sec": 0, 00:06:36.159 "w_mbytes_per_sec": 0 00:06:36.159 }, 00:06:36.159 "claimed": false, 00:06:36.159 "zoned": false, 00:06:36.159 "supported_io_types": { 00:06:36.159 "read": true, 
00:06:36.159 "write": true,
00:06:36.159 "unmap": true,
00:06:36.159 "flush": true,
00:06:36.159 "reset": true,
00:06:36.159 "nvme_admin": false,
00:06:36.159 "nvme_io": false,
00:06:36.159 "nvme_io_md": false,
00:06:36.159 "write_zeroes": true,
00:06:36.159 "zcopy": true,
00:06:36.159 "get_zone_info": false,
00:06:36.159 "zone_management": false,
00:06:36.159 "zone_append": false,
00:06:36.159 "compare": false,
00:06:36.159 "compare_and_write": false,
00:06:36.159 "abort": true,
00:06:36.159 "seek_hole": false,
00:06:36.159 "seek_data": false,
00:06:36.159 "copy": true,
00:06:36.159 "nvme_iov_md": false
00:06:36.159 },
00:06:36.159 "memory_domains": [
00:06:36.159 {
00:06:36.159 "dma_device_id": "system",
00:06:36.159 "dma_device_type": 1
00:06:36.159 },
00:06:36.159 {
00:06:36.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:36.159 "dma_device_type": 2
00:06:36.159 }
00:06:36.159 ],
00:06:36.159 "driver_specific": {
00:06:36.159 "passthru": {
00:06:36.159 "name": "Passthru0",
00:06:36.159 "base_bdev_name": "Malloc2"
00:06:36.159 }
00:06:36.159 }
00:06:36.159 }
00:06:36.159 ]'
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]'
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:06:36.159 
00:06:36.159 real 0m0.300s
00:06:36.159 user 0m0.186s
00:06:36.159 sys 0m0.055s
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:36.159 19:42:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:36.159 ************************************
00:06:36.159 END TEST rpc_daemon_integrity
00:06:36.159 ************************************
00:06:36.419 19:42:27 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT
00:06:36.419 19:42:27 rpc -- rpc/rpc.sh@84 -- # killprocess 1330856
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@950 -- # '[' -z 1330856 ']'
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@954 -- # kill -0 1330856
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@955 -- # uname
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1330856
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1330856'
00:06:36.419 killing process with pid 1330856
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@969 -- # kill 1330856
00:06:36.419 19:42:27 rpc -- common/autotest_common.sh@974 -- # wait 1330856
00:06:36.679 
00:06:36.679 real 0m2.830s
00:06:36.679 user 0m3.576s
00:06:36.679 sys 0m0.958s
00:06:36.679 19:42:28 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:36.679 19:42:28 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:36.679 ************************************
00:06:36.679 END TEST rpc
00:06:36.679 ************************************
00:06:36.679 19:42:28 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh
00:06:36.679 19:42:28 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:36.679 19:42:28 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:36.679 19:42:28 -- common/autotest_common.sh@10 -- # set +x
00:06:36.679 ************************************
00:06:36.679 START TEST skip_rpc
00:06:36.679 ************************************
00:06:36.679 19:42:28 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh
00:06:36.938 * Looking for test storage...
00:06:36.938 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:06:36.938 19:42:28 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json
00:06:36.938 19:42:28 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt
00:06:36.938 19:42:28 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc
00:06:36.938 19:42:28 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:36.938 19:42:28 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:36.938 19:42:28 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:36.938 ************************************
00:06:36.938 START TEST skip_rpc
00:06:36.938 ************************************
00:06:36.938 19:42:28 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc
00:06:36.938 19:42:28 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1331385
00:06:36.938 19:42:28 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:36.938 19:42:28 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1
00:06:36.938 19:42:28 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5
00:06:36.938 [2024-07-24 19:42:28.477300] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:06:36.938 [2024-07-24 19:42:28.477364] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1331385 ]
00:06:37.198 [2024-07-24 19:42:28.608161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:37.198 [2024-07-24 19:42:28.708332] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]]
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1331385
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 1331385 ']'
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 1331385
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1331385
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1331385'
00:06:42.475 killing process with pid 1331385
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 1331385
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 1331385
00:06:42.475 
00:06:42.475 real 0m5.460s
00:06:42.475 user 0m5.098s
00:06:42.475 sys 0m0.381s
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:42.475 19:42:33 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:42.475 ************************************
00:06:42.475 END TEST skip_rpc
00:06:42.475 ************************************
00:06:42.475 19:42:33 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json
00:06:42.475 19:42:33 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:42.475 19:42:33 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:42.475 19:42:33 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:42.475 ************************************
00:06:42.475 START TEST skip_rpc_with_json
00:06:42.475 ************************************
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1332109
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1332109
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 1332109 ']'
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:42.475 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable
00:06:42.475 19:42:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x
00:06:42.475 [2024-07-24 19:42:34.013486] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:06:42.475 [2024-07-24 19:42:34.013539] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1332109 ]
00:06:42.735 [2024-07-24 19:42:34.128280] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:42.735 [2024-07-24 19:42:34.230208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x
00:06:43.673 [2024-07-24 19:42:34.959011] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist
00:06:43.673 request:
00:06:43.673 {
00:06:43.673 "trtype": "tcp",
00:06:43.673 "method": "nvmf_get_transports",
00:06:43.673 "req_id": 1
00:06:43.673 }
00:06:43.673 Got JSON-RPC error response
00:06:43.673 response:
00:06:43.673 {
00:06:43.673 "code": -19,
00:06:43.673 "message": "No such device"
00:06:43.673 }
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]]
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x
00:06:43.673 [2024-07-24 19:42:34.971151] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:43.673 19:42:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x
00:06:43.673 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:43.673 19:42:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json
00:06:43.673 {
00:06:43.673 "subsystems": [
00:06:43.673 {
00:06:43.673 "subsystem": "keyring",
00:06:43.673 "config": []
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "subsystem": "iobuf",
00:06:43.673 "config": [
00:06:43.673 {
00:06:43.673 "method": "iobuf_set_options",
00:06:43.673 "params": {
00:06:43.673 "small_pool_count": 8192,
00:06:43.673 "large_pool_count": 1024,
00:06:43.673 "small_bufsize": 8192,
00:06:43.673 "large_bufsize": 135168
00:06:43.673 }
00:06:43.673 }
00:06:43.673 ]
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "subsystem": "sock",
00:06:43.673 "config": [
00:06:43.673 {
00:06:43.673 "method": "sock_set_default_impl",
00:06:43.673 "params": {
00:06:43.673 "impl_name": "posix"
00:06:43.673 }
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "method": "sock_impl_set_options",
00:06:43.673 "params": {
00:06:43.673 "impl_name": "ssl",
00:06:43.673 "recv_buf_size": 4096,
00:06:43.673 "send_buf_size": 4096,
00:06:43.673 "enable_recv_pipe": true,
00:06:43.673 "enable_quickack": false,
00:06:43.673 "enable_placement_id": 0,
00:06:43.673 "enable_zerocopy_send_server": true,
00:06:43.673 "enable_zerocopy_send_client": false,
00:06:43.673 "zerocopy_threshold": 0,
00:06:43.673 "tls_version": 0,
00:06:43.673 "enable_ktls": false
00:06:43.673 }
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "method": "sock_impl_set_options",
00:06:43.673 "params": {
00:06:43.673 "impl_name": "posix",
00:06:43.673 "recv_buf_size": 2097152,
00:06:43.673 "send_buf_size": 2097152,
00:06:43.673 "enable_recv_pipe": true,
00:06:43.673 "enable_quickack": false,
00:06:43.673 "enable_placement_id": 0,
00:06:43.673 "enable_zerocopy_send_server": true,
00:06:43.673 "enable_zerocopy_send_client": false,
00:06:43.673 "zerocopy_threshold": 0,
00:06:43.673 "tls_version": 0,
00:06:43.673 "enable_ktls": false
00:06:43.673 }
00:06:43.673 }
00:06:43.673 ]
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "subsystem": "vmd",
00:06:43.673 "config": []
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "subsystem": "accel",
00:06:43.673 "config": [
00:06:43.673 {
00:06:43.673 "method": "accel_set_options",
00:06:43.673 "params": {
00:06:43.673 "small_cache_size": 128,
00:06:43.673 "large_cache_size": 16,
00:06:43.673 "task_count": 2048,
00:06:43.673 "sequence_count": 2048,
00:06:43.673 "buf_count": 2048
00:06:43.673 }
00:06:43.673 }
00:06:43.673 ]
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "subsystem": "bdev",
00:06:43.673 "config": [
00:06:43.673 {
00:06:43.673 "method": "bdev_set_options",
00:06:43.673 "params": {
00:06:43.673 "bdev_io_pool_size": 65535,
00:06:43.673 "bdev_io_cache_size": 256,
00:06:43.673 "bdev_auto_examine": true,
00:06:43.673 "iobuf_small_cache_size": 128,
00:06:43.673 "iobuf_large_cache_size": 16
00:06:43.673 }
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "method": "bdev_raid_set_options",
00:06:43.673 "params": {
00:06:43.673 "process_window_size_kb": 1024,
00:06:43.673 "process_max_bandwidth_mb_sec": 0
00:06:43.673 }
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "method": "bdev_iscsi_set_options",
00:06:43.673 "params": {
00:06:43.673 "timeout_sec": 30
00:06:43.673 }
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "method": "bdev_nvme_set_options",
00:06:43.673 "params": {
00:06:43.673 "action_on_timeout": "none",
00:06:43.673 "timeout_us": 0,
00:06:43.673 "timeout_admin_us": 0,
00:06:43.673 "keep_alive_timeout_ms": 10000,
00:06:43.673 "arbitration_burst": 0,
00:06:43.673 "low_priority_weight": 0,
00:06:43.673 "medium_priority_weight": 0,
00:06:43.673 "high_priority_weight": 0,
00:06:43.673 "nvme_adminq_poll_period_us": 10000,
00:06:43.673 "nvme_ioq_poll_period_us": 0,
00:06:43.673 "io_queue_requests": 0,
00:06:43.673 "delay_cmd_submit": true,
00:06:43.673 "transport_retry_count": 4,
00:06:43.673 "bdev_retry_count": 3,
00:06:43.673 "transport_ack_timeout": 0,
00:06:43.673 "ctrlr_loss_timeout_sec": 0,
00:06:43.673 "reconnect_delay_sec": 0,
00:06:43.673 "fast_io_fail_timeout_sec": 0,
00:06:43.673 "disable_auto_failback": false,
00:06:43.673 "generate_uuids": false,
00:06:43.673 "transport_tos": 0,
00:06:43.673 "nvme_error_stat": false,
00:06:43.673 "rdma_srq_size": 0,
00:06:43.673 "io_path_stat": false,
00:06:43.673 "allow_accel_sequence": false,
00:06:43.673 "rdma_max_cq_size": 0,
00:06:43.673 "rdma_cm_event_timeout_ms": 0,
00:06:43.673 "dhchap_digests": [
00:06:43.673 "sha256",
00:06:43.673 "sha384",
00:06:43.673 "sha512"
00:06:43.673 ],
00:06:43.673 "dhchap_dhgroups": [
00:06:43.673 "null",
00:06:43.673 "ffdhe2048",
00:06:43.673 "ffdhe3072",
00:06:43.673 "ffdhe4096",
00:06:43.673 "ffdhe6144",
00:06:43.673 "ffdhe8192"
00:06:43.673 ]
00:06:43.673 }
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "method": "bdev_nvme_set_hotplug",
00:06:43.673 "params": {
00:06:43.673 "period_us": 100000,
00:06:43.673 "enable": false
00:06:43.673 }
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "method": "bdev_wait_for_examine"
00:06:43.673 }
00:06:43.673 ]
00:06:43.673 },
00:06:43.673 {
00:06:43.673 "subsystem": "scsi",
00:06:43.673 "config": null
00:06:43.673 },
00:06:43.674 {
00:06:43.674 "subsystem": "scheduler",
00:06:43.674 "config": [
00:06:43.674 {
00:06:43.674 "method": "framework_set_scheduler",
00:06:43.674 "params": {
00:06:43.674 "name": "static"
00:06:43.674 }
00:06:43.674 }
00:06:43.674 ]
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "subsystem": "vhost_scsi",
00:06:43.674 "config": []
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "subsystem": "vhost_blk",
00:06:43.674 "config": []
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "subsystem": "ublk",
00:06:43.674 "config": []
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "subsystem": "nbd",
00:06:43.674 "config": []
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "subsystem": "nvmf",
00:06:43.674 "config": [
00:06:43.674 {
00:06:43.674 "method": "nvmf_set_config",
00:06:43.674 "params": {
00:06:43.674 "discovery_filter": "match_any",
00:06:43.674 "admin_cmd_passthru": {
00:06:43.674 "identify_ctrlr": false
00:06:43.674 }
00:06:43.674 }
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "method": "nvmf_set_max_subsystems",
00:06:43.674 "params": {
00:06:43.674 "max_subsystems": 1024
00:06:43.674 }
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "method": "nvmf_set_crdt",
00:06:43.674 "params": {
00:06:43.674 "crdt1": 0,
00:06:43.674 "crdt2": 0,
00:06:43.674 "crdt3": 0
00:06:43.674 }
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "method": "nvmf_create_transport",
00:06:43.674 "params": {
00:06:43.674 "trtype": "TCP",
00:06:43.674 "max_queue_depth": 128,
00:06:43.674 "max_io_qpairs_per_ctrlr": 127,
00:06:43.674 "in_capsule_data_size": 4096,
00:06:43.674 "max_io_size": 131072,
00:06:43.674 "io_unit_size": 131072,
00:06:43.674 "max_aq_depth": 128,
00:06:43.674 "num_shared_buffers": 511,
00:06:43.674 "buf_cache_size": 4294967295,
00:06:43.674 "dif_insert_or_strip": false,
00:06:43.674 "zcopy": false,
00:06:43.674 "c2h_success": true,
00:06:43.674 "sock_priority": 0,
00:06:43.674 "abort_timeout_sec": 1,
00:06:43.674 "ack_timeout": 0,
00:06:43.674 "data_wr_pool_size": 0
00:06:43.674 }
00:06:43.674 }
00:06:43.674 ]
00:06:43.674 },
00:06:43.674 {
00:06:43.674 "subsystem": "iscsi",
00:06:43.674 "config": [
00:06:43.674 {
00:06:43.674 "method": "iscsi_set_options",
00:06:43.674 "params": {
00:06:43.674 "node_base": "iqn.2016-06.io.spdk",
00:06:43.674 "max_sessions": 128,
00:06:43.674 "max_connections_per_session": 2,
00:06:43.674 "max_queue_depth": 64,
00:06:43.674 "default_time2wait": 2,
00:06:43.674 "default_time2retain": 20,
00:06:43.674 "first_burst_length": 8192,
00:06:43.674 "immediate_data": true,
00:06:43.674 "allow_duplicated_isid": false,
00:06:43.674 "error_recovery_level": 0,
00:06:43.674 "nop_timeout": 60,
00:06:43.674 "nop_in_interval": 30,
00:06:43.674 "disable_chap": false,
00:06:43.674 "require_chap": false,
00:06:43.674 "mutual_chap": false,
00:06:43.674 "chap_group": 0,
00:06:43.674 "max_large_datain_per_connection": 64,
00:06:43.674 "max_r2t_per_connection": 4,
00:06:43.674 "pdu_pool_size": 36864,
00:06:43.674 "immediate_data_pool_size": 16384,
00:06:43.674 "data_out_pool_size": 2048
00:06:43.674 }
00:06:43.674 }
00:06:43.674 ]
00:06:43.674 }
00:06:43.674 ]
00:06:43.674 }
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1332109
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1332109 ']'
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1332109
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1332109
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1332109'
00:06:43.674 killing process with pid 1332109
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1332109
00:06:43.674 19:42:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1332109
00:06:44.243 19:42:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1332359
00:06:44.243 19:42:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5
00:06:44.243 19:42:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1332359
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1332359 ']'
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1332359
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1332359
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1332359'
00:06:49.569 killing process with pid 1332359
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1332359
00:06:49.569 19:42:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1332359
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt
00:06:49.569 
00:06:49.569 real 0m7.072s
00:06:49.569 user 0m6.788s
00:06:49.569 sys 0m0.850s
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x
00:06:49.569 ************************************
00:06:49.569 END TEST skip_rpc_with_json
00:06:49.569 ************************************
00:06:49.569 19:42:41 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay
00:06:49.569 19:42:41 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:49.569 19:42:41 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:49.569 19:42:41 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:49.569 ************************************
00:06:49.569 START TEST skip_rpc_with_delay
00:06:49.569 ************************************
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]]
00:06:49.569 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:06:49.828 [2024-07-24 19:42:41.182013] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started.
00:06:49.828 [2024-07-24 19:42:41.182107] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2
00:06:49.828 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1
00:06:49.828 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:49.828 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:06:49.828 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:49.828 
00:06:49.828 real 0m0.093s
00:06:49.828 user 0m0.056s
00:06:49.828 sys 0m0.037s
00:06:49.828 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:49.828 19:42:41 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x
00:06:49.828 ************************************
00:06:49.828 END TEST skip_rpc_with_delay
00:06:49.828 ************************************
00:06:49.828 19:42:41 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname
00:06:49.828 19:42:41 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']'
00:06:49.828 19:42:41 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init
00:06:49.828 19:42:41 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:49.828 19:42:41 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:49.828 19:42:41 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:49.828 ************************************
00:06:49.828 START TEST exit_on_failed_rpc_init
00:06:49.828 ************************************
00:06:49.828 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init
00:06:49.828 19:42:41 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1333213
00:06:49.828 19:42:41 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1333213
00:06:49.828 19:42:41 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:06:49.828 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 1333213 ']'
00:06:49.829 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:49.829 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100
00:06:49.829 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:49.829 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:49.829 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable
00:06:49.829 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x
00:06:49.829 [2024-07-24 19:42:41.363650] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:06:49.829 [2024-07-24 19:42:41.363715] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1333213 ]
00:06:50.088 [2024-07-24 19:42:41.495385] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:50.088 [2024-07-24 19:42:41.602898] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]]
00:06:50.347 19:42:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2
00:06:50.347 [2024-07-24 19:42:41.918296] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:06:50.347 [2024-07-24 19:42:41.918365] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1333232 ]
00:06:50.607 [2024-07-24 19:42:42.052172] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:50.607 [2024-07-24 19:42:42.167176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:06:50.607 [2024-07-24 19:42:42.167273] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another.
00:06:50.607 [2024-07-24 19:42:42.167295] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock
00:06:50.607 [2024-07-24 19:42:42.167311] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1333213
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 1333213 ']'
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 1333213
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1333213
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1333213'
00:06:50.866 killing process with pid 1333213 00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 1333213 00:06:50.866 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 1333213 00:06:51.435 00:06:51.435 real 0m1.428s 00:06:51.435 user 0m1.801s 00:06:51.435 sys 0m0.603s 00:06:51.435 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.435 19:42:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:51.435 ************************************ 00:06:51.435 END TEST exit_on_failed_rpc_init 00:06:51.435 ************************************ 00:06:51.435 19:42:42 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:51.435 00:06:51.435 real 0m14.505s 00:06:51.435 user 0m13.894s 00:06:51.435 sys 0m2.207s 00:06:51.435 19:42:42 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.435 19:42:42 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:51.435 ************************************ 00:06:51.435 END TEST skip_rpc 00:06:51.435 ************************************ 00:06:51.435 19:42:42 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:51.435 19:42:42 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.435 19:42:42 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.435 19:42:42 -- common/autotest_common.sh@10 -- # set +x 00:06:51.435 ************************************ 00:06:51.435 START TEST rpc_client 00:06:51.435 ************************************ 00:06:51.435 19:42:42 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:51.435 * Looking for test storage... 
00:06:51.435 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:51.435 19:42:42 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:51.435 OK 00:06:51.435 19:42:42 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:51.435 00:06:51.435 real 0m0.140s 00:06:51.435 user 0m0.063s 00:06:51.435 sys 0m0.087s 00:06:51.435 19:42:42 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.435 19:42:42 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:51.435 ************************************ 00:06:51.435 END TEST rpc_client 00:06:51.435 ************************************ 00:06:51.695 19:42:43 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:51.695 19:42:43 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.695 19:42:43 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.695 19:42:43 -- common/autotest_common.sh@10 -- # set +x 00:06:51.695 ************************************ 00:06:51.695 START TEST json_config 00:06:51.695 ************************************ 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@12 
-- # NVMF_IP_PREFIX=192.168.100 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:51.695 19:42:43 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:51.695 19:42:43 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:51.695 19:42:43 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:51.695 19:42:43 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.695 19:42:43 
json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.695 19:42:43 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.695 19:42:43 json_config -- paths/export.sh@5 -- # export PATH 00:06:51.695 19:42:43 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@47 -- # : 0 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@33 -- # 
'[' -n '' ']' 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:51.695 19:42:43 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@359 -- # 
trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:06:51.695 INFO: JSON configuration test init 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.695 19:42:43 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:06:51.695 19:42:43 json_config -- json_config/common.sh@9 -- # local app=target 00:06:51.695 19:42:43 json_config -- json_config/common.sh@10 -- # shift 00:06:51.695 19:42:43 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:51.695 19:42:43 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:51.695 19:42:43 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:51.695 19:42:43 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.695 19:42:43 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.695 19:42:43 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1333516 00:06:51.695 19:42:43 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:51.695 Waiting for target to run... 
00:06:51.695 19:42:43 json_config -- json_config/common.sh@25 -- # waitforlisten 1333516 /var/tmp/spdk_tgt.sock 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@831 -- # '[' -z 1333516 ']' 00:06:51.695 19:42:43 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:51.695 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:51.695 19:42:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.695 [2024-07-24 19:42:43.271821] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:06:51.695 [2024-07-24 19:42:43.271907] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1333516 ] 00:06:52.632 [2024-07-24 19:42:43.918882] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.632 [2024-07-24 19:42:44.021918] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.632 19:42:44 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:52.632 19:42:44 json_config -- common/autotest_common.sh@864 -- # return 0 00:06:52.632 19:42:44 json_config -- json_config/common.sh@26 -- # echo '' 00:06:52.632 00:06:52.632 19:42:44 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:06:52.632 19:42:44 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:06:52.632 19:42:44 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:52.632 19:42:44 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:52.632 19:42:44 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:06:52.632 19:42:44 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:52.632 19:42:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:52.891 19:42:44 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:52.891 19:42:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:53.150 [2024-07-24 19:42:44.676029] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:06:53.150 19:42:44 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:53.150 19:42:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:53.409 [2024-07-24 19:42:44.920657] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:53.409 19:42:44 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:06:53.409 19:42:44 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:53.409 19:42:44 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:53.409 19:42:44 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:53.409 19:42:44 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:06:53.409 19:42:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:53.668 [2024-07-24 19:42:45.234179] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:56.957 19:42:47 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:06:56.957 19:42:47 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:56.957 19:42:47 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:56.957 19:42:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:56.957 19:42:47 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:56.957 19:42:47 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:56.957 19:42:47 json_config -- json_config/json_config.sh@46 -- # local 
enabled_types 00:06:56.957 19:42:47 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:56.957 19:42:47 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:56.957 19:42:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@51 -- # sort 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:06:56.957 19:42:48 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:56.957 19:42:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@59 -- # return 0 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:06:56.957 19:42:48 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:56.957 19:42:48 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:56.957 19:42:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:06:56.957 19:42:48 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:56.957 19:42:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:57.217 Nvme0n1p0 Nvme0n1p1 00:06:57.217 19:42:48 json_config -- 
json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:57.217 19:42:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:57.476 [2024-07-24 19:42:48.863412] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:57.476 [2024-07-24 19:42:48.863475] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:57.476 00:06:57.476 19:42:48 json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:57.476 19:42:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:57.735 Malloc3 00:06:57.735 19:42:49 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:57.735 19:42:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:57.994 [2024-07-24 19:42:49.348793] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:57.994 [2024-07-24 19:42:49.348848] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:57.994 [2024-07-24 19:42:49.348870] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184d960 00:06:57.994 [2024-07-24 19:42:49.348883] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:57.994 [2024-07-24 19:42:49.350559] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:57.994 [2024-07-24 19:42:49.350589] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:57.994 PTBdevFromMalloc3 00:06:57.994 19:42:49 
json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:57.994 19:42:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:58.253 Null0 00:06:58.253 19:42:49 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:58.254 19:42:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:58.254 Malloc0 00:06:58.513 19:42:49 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:58.513 19:42:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:58.513 Malloc1 00:06:58.513 19:42:50 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:58.513 19:42:50 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:59.082 102400+0 records in 00:06:59.082 102400+0 records out 00:06:59.082 104857600 bytes (105 MB, 100 MiB) copied, 0.31141 s, 337 MB/s 00:06:59.082 19:42:50 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:59.082 19:42:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:59.082 aio_disk 00:06:59.341 19:42:50 json_config -- 
json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:59.341 19:42:50 json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:59.341 19:42:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:07:04.615 3f47f515-e727-4b5b-8d83-d59128ee460c 00:07:04.615 19:42:55 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:07:04.615 19:42:55 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:07:04.615 19:42:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:07:04.615 19:42:55 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:07:04.615 19:42:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:07:04.615 19:42:55 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:04.615 19:42:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:04.615 19:42:56 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:04.615 19:42:56 json_config -- json_config/common.sh@57 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:04.874 19:42:56 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:07:04.874 19:42:56 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:04.874 19:42:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:05.133 MallocForCryptoBdev 00:07:05.133 19:42:56 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:07:05.133 19:42:56 json_config -- json_config/json_config.sh@163 -- # wc -l 00:07:05.133 19:42:56 json_config -- json_config/json_config.sh@163 -- # [[ 3 -eq 0 ]] 00:07:05.133 19:42:56 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:07:05.133 19:42:56 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:05.133 19:42:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:05.392 [2024-07-24 19:42:56.815152] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:07:05.392 CryptoMallocBdev 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@173 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 
bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:762ba923-7b28-4aec-adb1-24c8377f4168 bdev_register:469335a3-b945-4401-a17c-d908d49b0e76 bdev_register:da318af4-8820-4f71-99f9-4aaa58494c55 bdev_register:113d8c6a-d6a2-4b4d-8f3c-e709aa7ddd8e bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:762ba923-7b28-4aec-adb1-24c8377f4168 bdev_register:469335a3-b945-4401-a17c-d908d49b0e76 bdev_register:da318af4-8820-4f71-99f9-4aaa58494c55 bdev_register:113d8c6a-d6a2-4b4d-8f3c-e709aa7ddd8e bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@75 -- # sort 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@76 -- # sort 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.392 19:42:56 
json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:07:05.392 19:42:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:07:05.392 19:42:56 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:Null0 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:762ba923-7b28-4aec-adb1-24c8377f4168 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:469335a3-b945-4401-a17c-d908d49b0e76 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:da318af4-8820-4f71-99f9-4aaa58494c55 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:113d8c6a-d6a2-4b4d-8f3c-e709aa7ddd8e 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:113d8c6a-d6a2-4b4d-8f3c-e709aa7ddd8e bdev_register:469335a3-b945-4401-a17c-d908d49b0e76 bdev_register:762ba923-7b28-4aec-adb1-24c8377f4168 bdev_register:aio_disk 
bdev_register:CryptoMallocBdev bdev_register:da318af4-8820-4f71-99f9-4aaa58494c55 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\1\3\d\8\c\6\a\-\d\6\a\2\-\4\b\4\d\-\8\f\3\c\-\e\7\0\9\a\a\7\d\d\d\8\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\6\9\3\3\5\a\3\-\b\9\4\5\-\4\4\0\1\-\a\1\7\c\-\d\9\0\8\d\4\9\b\0\e\7\6\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\6\2\b\a\9\2\3\-\7\b\2\8\-\4\a\e\c\-\a\d\b\1\-\2\4\c\8\3\7\7\f\4\1\6\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\a\3\1\8\a\f\4\-\8\8\2\0\-\4\f\7\1\-\9\9\f\9\-\4\a\a\a\5\8\4\9\4\c\5\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@90 -- # cat 00:07:05.652 19:42:57 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:113d8c6a-d6a2-4b4d-8f3c-e709aa7ddd8e bdev_register:469335a3-b945-4401-a17c-d908d49b0e76 bdev_register:762ba923-7b28-4aec-adb1-24c8377f4168 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:da318af4-8820-4f71-99f9-4aaa58494c55 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 
bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:07:05.652 Expected events matched: 00:07:05.652 bdev_register:113d8c6a-d6a2-4b4d-8f3c-e709aa7ddd8e 00:07:05.652 bdev_register:469335a3-b945-4401-a17c-d908d49b0e76 00:07:05.652 bdev_register:762ba923-7b28-4aec-adb1-24c8377f4168 00:07:05.652 bdev_register:aio_disk 00:07:05.652 bdev_register:CryptoMallocBdev 00:07:05.652 bdev_register:da318af4-8820-4f71-99f9-4aaa58494c55 00:07:05.652 bdev_register:Malloc0 00:07:05.652 bdev_register:Malloc0p0 00:07:05.652 bdev_register:Malloc0p1 00:07:05.652 bdev_register:Malloc0p2 00:07:05.652 bdev_register:Malloc1 00:07:05.652 bdev_register:Malloc3 00:07:05.652 bdev_register:MallocForCryptoBdev 00:07:05.652 bdev_register:Null0 00:07:05.652 bdev_register:Nvme0n1 00:07:05.652 bdev_register:Nvme0n1p0 00:07:05.652 bdev_register:Nvme0n1p1 00:07:05.652 bdev_register:PTBdevFromMalloc3 00:07:05.653 19:42:57 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:07:05.653 19:42:57 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:05.653 19:42:57 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.653 19:42:57 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:07:05.653 19:42:57 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:07:05.653 19:42:57 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:07:05.653 19:42:57 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:07:05.653 19:42:57 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:05.653 19:42:57 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.653 19:42:57 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:07:05.653 19:42:57 json_config -- json_config/json_config.sh@304 -- # 
tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:05.653 19:42:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:05.912 MallocBdevForConfigChangeCheck 00:07:05.912 19:42:57 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:07:05.912 19:42:57 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:05.912 19:42:57 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.912 19:42:57 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:07:05.912 19:42:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:06.482 19:42:57 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:07:06.482 INFO: shutting down applications... 
00:07:06.482 19:42:57 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:07:06.482 19:42:57 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:07:06.482 19:42:57 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:07:06.482 19:42:57 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:07:06.482 [2024-07-24 19:42:57.998840] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:07:09.773 Calling clear_iscsi_subsystem 00:07:09.773 Calling clear_nvmf_subsystem 00:07:09.773 Calling clear_nbd_subsystem 00:07:09.773 Calling clear_ublk_subsystem 00:07:09.773 Calling clear_vhost_blk_subsystem 00:07:09.773 Calling clear_vhost_scsi_subsystem 00:07:09.773 Calling clear_bdev_subsystem 00:07:09.773 19:43:00 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:07:09.773 19:43:00 json_config -- json_config/json_config.sh@347 -- # count=100 00:07:09.773 19:43:00 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:07:09.773 19:43:00 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:09.773 19:43:00 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:07:09.773 19:43:00 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:07:09.773 19:43:01 json_config -- json_config/json_config.sh@349 -- # break 00:07:09.773 19:43:01 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:07:09.773 19:43:01 
json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:07:09.773 19:43:01 json_config -- json_config/common.sh@31 -- # local app=target 00:07:09.773 19:43:01 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:09.773 19:43:01 json_config -- json_config/common.sh@35 -- # [[ -n 1333516 ]] 00:07:09.773 19:43:01 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1333516 00:07:09.773 19:43:01 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:09.773 19:43:01 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:09.773 19:43:01 json_config -- json_config/common.sh@41 -- # kill -0 1333516 00:07:09.773 19:43:01 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:07:10.342 19:43:01 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:07:10.342 19:43:01 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:10.342 19:43:01 json_config -- json_config/common.sh@41 -- # kill -0 1333516 00:07:10.342 19:43:01 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:10.342 19:43:01 json_config -- json_config/common.sh@43 -- # break 00:07:10.342 19:43:01 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:10.342 19:43:01 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:10.342 SPDK target shutdown done 00:07:10.342 19:43:01 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:07:10.342 INFO: relaunching applications... 
00:07:10.342 19:43:01 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:10.342 19:43:01 json_config -- json_config/common.sh@9 -- # local app=target 00:07:10.342 19:43:01 json_config -- json_config/common.sh@10 -- # shift 00:07:10.342 19:43:01 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:10.342 19:43:01 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:10.342 19:43:01 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:10.342 19:43:01 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:10.342 19:43:01 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:10.342 19:43:01 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1336123 00:07:10.342 19:43:01 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:10.342 Waiting for target to run... 00:07:10.342 19:43:01 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:10.342 19:43:01 json_config -- json_config/common.sh@25 -- # waitforlisten 1336123 /var/tmp/spdk_tgt.sock 00:07:10.342 19:43:01 json_config -- common/autotest_common.sh@831 -- # '[' -z 1336123 ']' 00:07:10.342 19:43:01 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:10.342 19:43:01 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:10.342 19:43:01 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:10.342 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
00:07:10.342 19:43:01 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:10.342 19:43:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:10.342 [2024-07-24 19:43:01.865973] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:07:10.342 [2024-07-24 19:43:01.866038] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1336123 ] 00:07:10.911 [2024-07-24 19:43:02.430301] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.171 [2024-07-24 19:43:02.538865] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.171 [2024-07-24 19:43:02.593117] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:07:11.171 [2024-07-24 19:43:02.601154] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:07:11.171 [2024-07-24 19:43:02.609172] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:11.171 [2024-07-24 19:43:02.690428] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:13.776 [2024-07-24 19:43:04.903857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:13.776 [2024-07-24 19:43:04.903923] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:13.776 [2024-07-24 19:43:04.903938] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:13.776 [2024-07-24 19:43:04.911874] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:13.776 [2024-07-24 19:43:04.911901] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Nvme0n1 00:07:13.776 [2024-07-24 19:43:04.919887] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:13.776 [2024-07-24 19:43:04.919912] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:13.776 [2024-07-24 19:43:04.927922] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:07:13.776 [2024-07-24 19:43:04.927951] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:07:13.776 [2024-07-24 19:43:04.927965] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:13.776 [2024-07-24 19:43:05.305208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:13.776 [2024-07-24 19:43:05.305256] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:13.776 [2024-07-24 19:43:05.305276] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2847280 00:07:13.776 [2024-07-24 19:43:05.305288] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:13.776 [2024-07-24 19:43:05.305605] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:13.776 [2024-07-24 19:43:05.305623] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:14.035 19:43:05 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:14.035 19:43:05 json_config -- common/autotest_common.sh@864 -- # return 0 00:07:14.035 19:43:05 json_config -- json_config/common.sh@26 -- # echo '' 00:07:14.035 00:07:14.035 19:43:05 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:07:14.035 19:43:05 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:07:14.035 INFO: Checking if target configuration is the same... 
00:07:14.035 19:43:05 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:14.035 19:43:05 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:07:14.035 19:43:05 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:14.035 + '[' 2 -ne 2 ']' 00:07:14.035 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:14.035 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:14.035 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:14.035 +++ basename /dev/fd/62 00:07:14.035 ++ mktemp /tmp/62.XXX 00:07:14.035 + tmp_file_1=/tmp/62.jw5 00:07:14.035 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:14.035 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:14.035 + tmp_file_2=/tmp/spdk_tgt_config.json.9hj 00:07:14.035 + ret=0 00:07:14.035 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:14.294 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:14.553 + diff -u /tmp/62.jw5 /tmp/spdk_tgt_config.json.9hj 00:07:14.553 + echo 'INFO: JSON config files are the same' 00:07:14.553 INFO: JSON config files are the same 00:07:14.553 + rm /tmp/62.jw5 /tmp/spdk_tgt_config.json.9hj 00:07:14.553 + exit 0 00:07:14.553 19:43:05 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:07:14.553 19:43:05 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:07:14.553 INFO: changing configuration and checking if this can be detected... 
00:07:14.553 19:43:05 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:14.553 19:43:05 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:14.553 19:43:06 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:07:14.553 19:43:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:14.553 19:43:06 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:14.553 + '[' 2 -ne 2 ']' 00:07:14.553 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:14.553 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:07:14.553 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:14.553 +++ basename /dev/fd/62 00:07:14.553 ++ mktemp /tmp/62.XXX 00:07:14.553 + tmp_file_1=/tmp/62.LIm 00:07:14.553 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:14.812 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:14.812 + tmp_file_2=/tmp/spdk_tgt_config.json.XAx 00:07:14.812 + ret=0 00:07:14.812 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:15.071 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:15.071 + diff -u /tmp/62.LIm /tmp/spdk_tgt_config.json.XAx 00:07:15.071 + ret=1 00:07:15.071 + echo '=== Start of file: /tmp/62.LIm ===' 00:07:15.071 + cat /tmp/62.LIm 00:07:15.071 + echo '=== End of file: /tmp/62.LIm ===' 00:07:15.071 + echo '' 00:07:15.071 + echo '=== Start of file: /tmp/spdk_tgt_config.json.XAx ===' 00:07:15.071 + cat /tmp/spdk_tgt_config.json.XAx 00:07:15.071 + echo '=== End of file: /tmp/spdk_tgt_config.json.XAx ===' 00:07:15.071 + echo '' 00:07:15.071 + rm /tmp/62.LIm /tmp/spdk_tgt_config.json.XAx 00:07:15.071 + exit 1 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:07:15.071 INFO: configuration change detected. 
00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:07:15.071 19:43:06 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:15.071 19:43:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@321 -- # [[ -n 1336123 ]] 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:07:15.071 19:43:06 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:15.071 19:43:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:07:15.071 19:43:06 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:15.071 19:43:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:15.330 19:43:06 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:15.330 19:43:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:15.588 19:43:06 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:15.588 19:43:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:07:15.588 19:43:07 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:15.588 19:43:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:15.848 19:43:07 json_config -- json_config/json_config.sh@197 -- # uname -s 00:07:15.848 19:43:07 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:07:15.848 19:43:07 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:07:15.848 19:43:07 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:07:15.848 19:43:07 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:15.848 19:43:07 json_config -- json_config/json_config.sh@327 -- # killprocess 1336123 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@950 -- # '[' -z 1336123 ']' 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@954 -- # kill -0 1336123 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@955 -- # uname 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1336123 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1336123' 00:07:15.848 killing process with pid 1336123 00:07:15.848 19:43:07 json_config -- common/autotest_common.sh@969 -- # kill 1336123 00:07:15.848 19:43:07 json_config -- 
common/autotest_common.sh@974 -- # wait 1336123 00:07:19.135 19:43:10 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:19.135 19:43:10 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:07:19.135 19:43:10 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:19.135 19:43:10 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:19.135 19:43:10 json_config -- json_config/json_config.sh@332 -- # return 0 00:07:19.135 19:43:10 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:07:19.135 INFO: Success 00:07:19.135 00:07:19.135 real 0m27.623s 00:07:19.135 user 0m32.908s 00:07:19.135 sys 0m4.155s 00:07:19.135 19:43:10 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:19.135 19:43:10 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:19.135 ************************************ 00:07:19.135 END TEST json_config 00:07:19.135 ************************************ 00:07:19.394 19:43:10 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:19.394 19:43:10 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:19.394 19:43:10 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:19.394 19:43:10 -- common/autotest_common.sh@10 -- # set +x 00:07:19.394 ************************************ 00:07:19.394 START TEST json_config_extra_key 00:07:19.394 ************************************ 00:07:19.394 19:43:10 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:19.394 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:19.394 19:43:10 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:19.394 19:43:10 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:19.394 19:43:10 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:19.394 19:43:10 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:19.395 19:43:10 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e 
/bin/wpdk_common.sh ]] 00:07:19.395 19:43:10 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:19.395 19:43:10 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:19.395 19:43:10 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:19.395 19:43:10 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:19.395 19:43:10 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:19.395 19:43:10 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:19.395 19:43:10 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:19.395 19:43:10 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:19.395 INFO: launching applications... 00:07:19.395 19:43:10 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1337453 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:19.395 Waiting for target to run... 
00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1337453 /var/tmp/spdk_tgt.sock 00:07:19.395 19:43:10 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 1337453 ']' 00:07:19.395 19:43:10 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:19.395 19:43:10 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:19.395 19:43:10 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:19.395 19:43:10 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:19.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:19.395 19:43:10 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:19.395 19:43:10 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:19.395 [2024-07-24 19:43:10.957285] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:19.395 [2024-07-24 19:43:10.957348] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1337453 ] 00:07:19.964 [2024-07-24 19:43:11.516989] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.224 [2024-07-24 19:43:11.627891] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.224 19:43:11 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:20.224 19:43:11 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:20.224 00:07:20.224 19:43:11 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:20.224 INFO: shutting down applications... 00:07:20.224 19:43:11 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1337453 ]] 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1337453 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1337453 00:07:20.224 19:43:11 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:20.793 19:43:12 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:20.793 19:43:12 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:07:20.793 19:43:12 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1337453 00:07:20.793 19:43:12 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:20.793 19:43:12 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:20.793 19:43:12 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:20.793 19:43:12 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:20.793 SPDK target shutdown done 00:07:20.793 19:43:12 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:20.793 Success 00:07:20.793 00:07:20.793 real 0m1.482s 00:07:20.793 user 0m0.862s 00:07:20.793 sys 0m0.686s 00:07:20.793 19:43:12 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:20.793 19:43:12 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:20.793 ************************************ 00:07:20.793 END TEST json_config_extra_key 00:07:20.793 ************************************ 00:07:20.793 19:43:12 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:20.793 19:43:12 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:20.793 19:43:12 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:20.793 19:43:12 -- common/autotest_common.sh@10 -- # set +x 00:07:20.793 ************************************ 00:07:20.793 START TEST alias_rpc 00:07:20.793 ************************************ 00:07:20.793 19:43:12 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:21.071 * Looking for test storage... 
00:07:21.071 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:21.071 19:43:12 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:21.071 19:43:12 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1337702 00:07:21.071 19:43:12 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1337702 00:07:21.071 19:43:12 alias_rpc -- common/autotest_common.sh@831 -- # '[' -z 1337702 ']' 00:07:21.071 19:43:12 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.071 19:43:12 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:21.071 19:43:12 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:21.071 19:43:12 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.071 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.071 19:43:12 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:21.071 19:43:12 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:21.071 [2024-07-24 19:43:12.527023] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:21.071 [2024-07-24 19:43:12.527096] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1337702 ] 00:07:21.071 [2024-07-24 19:43:12.655093] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.330 [2024-07-24 19:43:12.758763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.267 19:43:13 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:22.267 19:43:13 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:22.267 19:43:13 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:22.834 19:43:14 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1337702 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 1337702 ']' 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 1337702 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1337702 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1337702' 00:07:22.834 killing process with pid 1337702 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@969 -- # kill 1337702 00:07:22.834 19:43:14 alias_rpc -- common/autotest_common.sh@974 -- # wait 1337702 00:07:23.402 00:07:23.402 real 0m2.354s 00:07:23.402 user 0m2.977s 00:07:23.402 sys 0m0.658s 00:07:23.402 19:43:14 alias_rpc -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:07:23.402 19:43:14 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:23.402 ************************************ 00:07:23.402 END TEST alias_rpc 00:07:23.402 ************************************ 00:07:23.402 19:43:14 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:23.402 19:43:14 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:23.402 19:43:14 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:23.402 19:43:14 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:23.403 19:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:23.403 ************************************ 00:07:23.403 START TEST spdkcli_tcp 00:07:23.403 ************************************ 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:23.403 * Looking for test storage... 
00:07:23.403 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1338087 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1338087 00:07:23.403 19:43:14 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 1338087 ']' 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:23.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:23.403 19:43:14 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:23.403 [2024-07-24 19:43:14.963041] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:07:23.403 [2024-07-24 19:43:14.963102] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338087 ] 00:07:23.662 [2024-07-24 19:43:15.077344] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:23.662 [2024-07-24 19:43:15.180164] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.662 [2024-07-24 19:43:15.180173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.598 19:43:15 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:24.598 19:43:15 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:07:24.598 19:43:15 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1338121 00:07:24.598 19:43:15 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:24.598 19:43:15 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:24.598 [ 00:07:24.598 "bdev_malloc_delete", 00:07:24.598 "bdev_malloc_create", 00:07:24.598 "bdev_null_resize", 00:07:24.598 "bdev_null_delete", 00:07:24.598 "bdev_null_create", 00:07:24.598 "bdev_nvme_cuse_unregister", 00:07:24.598 "bdev_nvme_cuse_register", 00:07:24.598 "bdev_opal_new_user", 00:07:24.598 "bdev_opal_set_lock_state", 00:07:24.598 "bdev_opal_delete", 00:07:24.598 "bdev_opal_get_info", 00:07:24.598 "bdev_opal_create", 00:07:24.598 "bdev_nvme_opal_revert", 00:07:24.598 "bdev_nvme_opal_init", 00:07:24.598 "bdev_nvme_send_cmd", 00:07:24.598 
"bdev_nvme_get_path_iostat", 00:07:24.598 "bdev_nvme_get_mdns_discovery_info", 00:07:24.598 "bdev_nvme_stop_mdns_discovery", 00:07:24.598 "bdev_nvme_start_mdns_discovery", 00:07:24.598 "bdev_nvme_set_multipath_policy", 00:07:24.598 "bdev_nvme_set_preferred_path", 00:07:24.598 "bdev_nvme_get_io_paths", 00:07:24.598 "bdev_nvme_remove_error_injection", 00:07:24.598 "bdev_nvme_add_error_injection", 00:07:24.598 "bdev_nvme_get_discovery_info", 00:07:24.598 "bdev_nvme_stop_discovery", 00:07:24.598 "bdev_nvme_start_discovery", 00:07:24.598 "bdev_nvme_get_controller_health_info", 00:07:24.598 "bdev_nvme_disable_controller", 00:07:24.598 "bdev_nvme_enable_controller", 00:07:24.598 "bdev_nvme_reset_controller", 00:07:24.598 "bdev_nvme_get_transport_statistics", 00:07:24.598 "bdev_nvme_apply_firmware", 00:07:24.598 "bdev_nvme_detach_controller", 00:07:24.598 "bdev_nvme_get_controllers", 00:07:24.598 "bdev_nvme_attach_controller", 00:07:24.598 "bdev_nvme_set_hotplug", 00:07:24.598 "bdev_nvme_set_options", 00:07:24.598 "bdev_passthru_delete", 00:07:24.598 "bdev_passthru_create", 00:07:24.598 "bdev_lvol_set_parent_bdev", 00:07:24.598 "bdev_lvol_set_parent", 00:07:24.598 "bdev_lvol_check_shallow_copy", 00:07:24.598 "bdev_lvol_start_shallow_copy", 00:07:24.598 "bdev_lvol_grow_lvstore", 00:07:24.598 "bdev_lvol_get_lvols", 00:07:24.598 "bdev_lvol_get_lvstores", 00:07:24.598 "bdev_lvol_delete", 00:07:24.598 "bdev_lvol_set_read_only", 00:07:24.598 "bdev_lvol_resize", 00:07:24.598 "bdev_lvol_decouple_parent", 00:07:24.598 "bdev_lvol_inflate", 00:07:24.598 "bdev_lvol_rename", 00:07:24.598 "bdev_lvol_clone_bdev", 00:07:24.598 "bdev_lvol_clone", 00:07:24.598 "bdev_lvol_snapshot", 00:07:24.598 "bdev_lvol_create", 00:07:24.598 "bdev_lvol_delete_lvstore", 00:07:24.598 "bdev_lvol_rename_lvstore", 00:07:24.598 "bdev_lvol_create_lvstore", 00:07:24.598 "bdev_raid_set_options", 00:07:24.599 "bdev_raid_remove_base_bdev", 00:07:24.599 "bdev_raid_add_base_bdev", 00:07:24.599 "bdev_raid_delete", 
00:07:24.599 "bdev_raid_create", 00:07:24.599 "bdev_raid_get_bdevs", 00:07:24.599 "bdev_error_inject_error", 00:07:24.599 "bdev_error_delete", 00:07:24.599 "bdev_error_create", 00:07:24.599 "bdev_split_delete", 00:07:24.599 "bdev_split_create", 00:07:24.599 "bdev_delay_delete", 00:07:24.599 "bdev_delay_create", 00:07:24.599 "bdev_delay_update_latency", 00:07:24.599 "bdev_zone_block_delete", 00:07:24.599 "bdev_zone_block_create", 00:07:24.599 "blobfs_create", 00:07:24.599 "blobfs_detect", 00:07:24.599 "blobfs_set_cache_size", 00:07:24.599 "bdev_crypto_delete", 00:07:24.599 "bdev_crypto_create", 00:07:24.599 "bdev_compress_delete", 00:07:24.599 "bdev_compress_create", 00:07:24.599 "bdev_compress_get_orphans", 00:07:24.599 "bdev_aio_delete", 00:07:24.599 "bdev_aio_rescan", 00:07:24.599 "bdev_aio_create", 00:07:24.599 "bdev_ftl_set_property", 00:07:24.599 "bdev_ftl_get_properties", 00:07:24.599 "bdev_ftl_get_stats", 00:07:24.599 "bdev_ftl_unmap", 00:07:24.599 "bdev_ftl_unload", 00:07:24.599 "bdev_ftl_delete", 00:07:24.599 "bdev_ftl_load", 00:07:24.599 "bdev_ftl_create", 00:07:24.599 "bdev_virtio_attach_controller", 00:07:24.599 "bdev_virtio_scsi_get_devices", 00:07:24.599 "bdev_virtio_detach_controller", 00:07:24.599 "bdev_virtio_blk_set_hotplug", 00:07:24.599 "bdev_iscsi_delete", 00:07:24.599 "bdev_iscsi_create", 00:07:24.599 "bdev_iscsi_set_options", 00:07:24.599 "accel_error_inject_error", 00:07:24.599 "ioat_scan_accel_module", 00:07:24.599 "dsa_scan_accel_module", 00:07:24.599 "iaa_scan_accel_module", 00:07:24.599 "dpdk_cryptodev_get_driver", 00:07:24.599 "dpdk_cryptodev_set_driver", 00:07:24.599 "dpdk_cryptodev_scan_accel_module", 00:07:24.599 "compressdev_scan_accel_module", 00:07:24.599 "keyring_file_remove_key", 00:07:24.599 "keyring_file_add_key", 00:07:24.599 "keyring_linux_set_options", 00:07:24.599 "iscsi_get_histogram", 00:07:24.599 "iscsi_enable_histogram", 00:07:24.599 "iscsi_set_options", 00:07:24.599 "iscsi_get_auth_groups", 00:07:24.599 
"iscsi_auth_group_remove_secret", 00:07:24.599 "iscsi_auth_group_add_secret", 00:07:24.599 "iscsi_delete_auth_group", 00:07:24.599 "iscsi_create_auth_group", 00:07:24.599 "iscsi_set_discovery_auth", 00:07:24.599 "iscsi_get_options", 00:07:24.599 "iscsi_target_node_request_logout", 00:07:24.599 "iscsi_target_node_set_redirect", 00:07:24.599 "iscsi_target_node_set_auth", 00:07:24.599 "iscsi_target_node_add_lun", 00:07:24.599 "iscsi_get_stats", 00:07:24.599 "iscsi_get_connections", 00:07:24.599 "iscsi_portal_group_set_auth", 00:07:24.599 "iscsi_start_portal_group", 00:07:24.599 "iscsi_delete_portal_group", 00:07:24.599 "iscsi_create_portal_group", 00:07:24.599 "iscsi_get_portal_groups", 00:07:24.599 "iscsi_delete_target_node", 00:07:24.599 "iscsi_target_node_remove_pg_ig_maps", 00:07:24.599 "iscsi_target_node_add_pg_ig_maps", 00:07:24.599 "iscsi_create_target_node", 00:07:24.599 "iscsi_get_target_nodes", 00:07:24.599 "iscsi_delete_initiator_group", 00:07:24.599 "iscsi_initiator_group_remove_initiators", 00:07:24.599 "iscsi_initiator_group_add_initiators", 00:07:24.599 "iscsi_create_initiator_group", 00:07:24.599 "iscsi_get_initiator_groups", 00:07:24.599 "nvmf_set_crdt", 00:07:24.599 "nvmf_set_config", 00:07:24.599 "nvmf_set_max_subsystems", 00:07:24.599 "nvmf_stop_mdns_prr", 00:07:24.599 "nvmf_publish_mdns_prr", 00:07:24.599 "nvmf_subsystem_get_listeners", 00:07:24.599 "nvmf_subsystem_get_qpairs", 00:07:24.599 "nvmf_subsystem_get_controllers", 00:07:24.599 "nvmf_get_stats", 00:07:24.599 "nvmf_get_transports", 00:07:24.599 "nvmf_create_transport", 00:07:24.599 "nvmf_get_targets", 00:07:24.599 "nvmf_delete_target", 00:07:24.599 "nvmf_create_target", 00:07:24.599 "nvmf_subsystem_allow_any_host", 00:07:24.599 "nvmf_subsystem_remove_host", 00:07:24.599 "nvmf_subsystem_add_host", 00:07:24.599 "nvmf_ns_remove_host", 00:07:24.599 "nvmf_ns_add_host", 00:07:24.599 "nvmf_subsystem_remove_ns", 00:07:24.599 "nvmf_subsystem_add_ns", 00:07:24.599 
"nvmf_subsystem_listener_set_ana_state", 00:07:24.599 "nvmf_discovery_get_referrals", 00:07:24.599 "nvmf_discovery_remove_referral", 00:07:24.599 "nvmf_discovery_add_referral", 00:07:24.599 "nvmf_subsystem_remove_listener", 00:07:24.599 "nvmf_subsystem_add_listener", 00:07:24.599 "nvmf_delete_subsystem", 00:07:24.599 "nvmf_create_subsystem", 00:07:24.599 "nvmf_get_subsystems", 00:07:24.599 "env_dpdk_get_mem_stats", 00:07:24.599 "nbd_get_disks", 00:07:24.599 "nbd_stop_disk", 00:07:24.599 "nbd_start_disk", 00:07:24.599 "ublk_recover_disk", 00:07:24.599 "ublk_get_disks", 00:07:24.599 "ublk_stop_disk", 00:07:24.599 "ublk_start_disk", 00:07:24.599 "ublk_destroy_target", 00:07:24.599 "ublk_create_target", 00:07:24.599 "virtio_blk_create_transport", 00:07:24.599 "virtio_blk_get_transports", 00:07:24.599 "vhost_controller_set_coalescing", 00:07:24.599 "vhost_get_controllers", 00:07:24.599 "vhost_delete_controller", 00:07:24.599 "vhost_create_blk_controller", 00:07:24.599 "vhost_scsi_controller_remove_target", 00:07:24.599 "vhost_scsi_controller_add_target", 00:07:24.599 "vhost_start_scsi_controller", 00:07:24.599 "vhost_create_scsi_controller", 00:07:24.599 "thread_set_cpumask", 00:07:24.599 "framework_get_governor", 00:07:24.599 "framework_get_scheduler", 00:07:24.599 "framework_set_scheduler", 00:07:24.599 "framework_get_reactors", 00:07:24.599 "thread_get_io_channels", 00:07:24.599 "thread_get_pollers", 00:07:24.599 "thread_get_stats", 00:07:24.599 "framework_monitor_context_switch", 00:07:24.599 "spdk_kill_instance", 00:07:24.599 "log_enable_timestamps", 00:07:24.599 "log_get_flags", 00:07:24.599 "log_clear_flag", 00:07:24.599 "log_set_flag", 00:07:24.599 "log_get_level", 00:07:24.599 "log_set_level", 00:07:24.599 "log_get_print_level", 00:07:24.599 "log_set_print_level", 00:07:24.599 "framework_enable_cpumask_locks", 00:07:24.599 "framework_disable_cpumask_locks", 00:07:24.599 "framework_wait_init", 00:07:24.599 "framework_start_init", 00:07:24.599 "scsi_get_devices", 
00:07:24.599 "bdev_get_histogram", 00:07:24.599 "bdev_enable_histogram", 00:07:24.599 "bdev_set_qos_limit", 00:07:24.599 "bdev_set_qd_sampling_period", 00:07:24.599 "bdev_get_bdevs", 00:07:24.599 "bdev_reset_iostat", 00:07:24.599 "bdev_get_iostat", 00:07:24.599 "bdev_examine", 00:07:24.599 "bdev_wait_for_examine", 00:07:24.599 "bdev_set_options", 00:07:24.599 "notify_get_notifications", 00:07:24.599 "notify_get_types", 00:07:24.599 "accel_get_stats", 00:07:24.599 "accel_set_options", 00:07:24.599 "accel_set_driver", 00:07:24.599 "accel_crypto_key_destroy", 00:07:24.599 "accel_crypto_keys_get", 00:07:24.599 "accel_crypto_key_create", 00:07:24.599 "accel_assign_opc", 00:07:24.599 "accel_get_module_info", 00:07:24.599 "accel_get_opc_assignments", 00:07:24.599 "vmd_rescan", 00:07:24.599 "vmd_remove_device", 00:07:24.599 "vmd_enable", 00:07:24.599 "sock_get_default_impl", 00:07:24.599 "sock_set_default_impl", 00:07:24.599 "sock_impl_set_options", 00:07:24.599 "sock_impl_get_options", 00:07:24.599 "iobuf_get_stats", 00:07:24.599 "iobuf_set_options", 00:07:24.599 "framework_get_pci_devices", 00:07:24.599 "framework_get_config", 00:07:24.600 "framework_get_subsystems", 00:07:24.600 "trace_get_info", 00:07:24.600 "trace_get_tpoint_group_mask", 00:07:24.600 "trace_disable_tpoint_group", 00:07:24.600 "trace_enable_tpoint_group", 00:07:24.600 "trace_clear_tpoint_mask", 00:07:24.600 "trace_set_tpoint_mask", 00:07:24.600 "keyring_get_keys", 00:07:24.600 "spdk_get_version", 00:07:24.600 "rpc_get_methods" 00:07:24.600 ] 00:07:24.600 19:43:16 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:24.600 19:43:16 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:24.600 19:43:16 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:24.858 19:43:16 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:24.858 19:43:16 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1338087 00:07:24.858 19:43:16 spdkcli_tcp -- 
common/autotest_common.sh@950 -- # '[' -z 1338087 ']' 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 1338087 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1338087 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1338087' 00:07:24.858 killing process with pid 1338087 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 1338087 00:07:24.858 19:43:16 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 1338087 00:07:25.117 00:07:25.117 real 0m1.866s 00:07:25.117 user 0m3.439s 00:07:25.117 sys 0m0.596s 00:07:25.117 19:43:16 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:25.117 19:43:16 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:25.117 ************************************ 00:07:25.117 END TEST spdkcli_tcp 00:07:25.117 ************************************ 00:07:25.117 19:43:16 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:25.117 19:43:16 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:25.117 19:43:16 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:25.117 19:43:16 -- common/autotest_common.sh@10 -- # set +x 00:07:25.378 ************************************ 00:07:25.378 START TEST dpdk_mem_utility 00:07:25.378 ************************************ 00:07:25.378 19:43:16 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:25.378 * Looking for test storage... 00:07:25.378 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:25.378 19:43:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:25.378 19:43:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1338355 00:07:25.378 19:43:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1338355 00:07:25.378 19:43:16 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:25.378 19:43:16 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 1338355 ']' 00:07:25.378 19:43:16 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.378 19:43:16 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:25.378 19:43:16 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.378 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.378 19:43:16 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:25.378 19:43:16 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:25.378 [2024-07-24 19:43:16.915239] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:25.378 [2024-07-24 19:43:16.915312] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338355 ] 00:07:25.637 [2024-07-24 19:43:17.044398] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.637 [2024-07-24 19:43:17.141419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.579 19:43:17 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:26.579 19:43:17 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:07:26.579 19:43:17 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:26.579 19:43:17 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:26.579 19:43:17 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:26.579 19:43:17 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:26.579 { 00:07:26.579 "filename": "/tmp/spdk_mem_dump.txt" 00:07:26.579 } 00:07:26.579 19:43:17 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:26.579 19:43:17 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:26.579 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:26.579 2 heaps totaling size 816.000000 MiB 00:07:26.579 size: 814.000000 MiB heap id: 0 00:07:26.579 size: 2.000000 MiB heap id: 1 00:07:26.579 end heaps---------- 00:07:26.579 8 mempools totaling size 598.116089 MiB 00:07:26.579 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:26.579 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:26.579 size: 84.521057 MiB name: bdev_io_1338355 00:07:26.580 size: 51.011292 MiB name: evtpool_1338355 00:07:26.580 size: 
50.003479 MiB name: msgpool_1338355 00:07:26.580 size: 21.763794 MiB name: PDU_Pool 00:07:26.580 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:26.580 size: 0.026123 MiB name: Session_Pool 00:07:26.580 end mempools------- 00:07:26.580 201 memzones totaling size 4.176453 MiB 00:07:26.580 size: 1.000366 MiB name: RG_ring_0_1338355 00:07:26.580 size: 1.000366 MiB name: RG_ring_1_1338355 00:07:26.580 size: 1.000366 MiB name: RG_ring_4_1338355 00:07:26.580 size: 1.000366 MiB name: RG_ring_5_1338355 00:07:26.580 size: 0.125366 MiB name: RG_ring_2_1338355 00:07:26.580 size: 0.015991 MiB name: RG_ring_3_1338355 00:07:26.580 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:01.7_qat 
00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:26.580 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:26.580 size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:26.580 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:26.580 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:26.580 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_20 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:26.580 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:26.580 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:26.580 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_80 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:26.581 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:07:26.581 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:26.581 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:26.581 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:26.581 end memzones------- 00:07:26.581 19:43:17 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:26.581 heap id: 0 total size: 814.000000 MiB number of busy elements: 554 number of free elements: 14 00:07:26.581 list of free elements. size: 11.808411 MiB 00:07:26.581 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:26.581 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:26.581 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:26.581 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:26.581 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:26.581 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:26.581 element at address: 0x200007000000 with size: 0.959839 MiB 00:07:26.581 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:26.581 element at address: 0x20001aa00000 with size: 0.580322 MiB 00:07:26.581 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:26.581 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:26.581 element at address: 0x200000800000 with size: 0.486877 MiB 00:07:26.581 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:26.581 element at address: 0x200027e00000 with size: 0.399414 MiB 00:07:26.581 list of standard malloc elements. 
size: 199.883301 MiB 00:07:26.581 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:26.581 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:26.581 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:26.581 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:26.581 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:26.581 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:26.581 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:26.581 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:26.581 element at address: 0x200000330b40 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000337640 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000033e140 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000344c40 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000034b740 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000352240 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000358d40 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000035f840 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000366880 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000036a340 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000036de00 with size: 0.004395 MiB 00:07:26.581 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000375380 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000378e40 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000037c900 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000383e80 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000387940 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000038b400 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000392980 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000396440 with size: 0.004395 MiB 00:07:26.581 element at address: 0x200000399f00 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:07:26.581 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:07:26.581 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:26.581 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000333040 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000335540 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000339b40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000033c040 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000340640 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000342b40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000347140 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000349640 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000350140 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000354740 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000356c40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:07:26.582 element at address: 0x20000035b240 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000035d740 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000361d40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000364780 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000365800 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000368240 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000370840 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000373280 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000374300 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000376d40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000037a800 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000037b880 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000037f340 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000381d80 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000382e00 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000385840 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000389300 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000038a380 with size: 0.004028 MiB 00:07:26.582 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000038de40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000390880 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000391900 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000394340 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000397e00 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000398e80 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000039c940 with size: 0.004028 MiB 00:07:26.582 element at address: 0x20000039f380 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:07:26.582 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:26.582 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:26.582 element at address: 0x200000204e00 with size: 0.000305 MiB 00:07:26.582 element at address: 0x200000200000 with size: 0.000183 MiB 00:07:26.582 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200180 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200240 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200300 with size: 0.000183 MiB 00:07:26.582 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200480 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200540 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200600 with size: 0.000183 MiB 00:07:26.582 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200780 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200840 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200900 with size: 0.000183 
MiB 00:07:26.582 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200a80 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200b40 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200c00 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200d80 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200e40 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200f00 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201080 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201140 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201200 with size: 0.000183 MiB 00:07:26.582 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201380 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201440 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201500 with size: 0.000183 MiB 00:07:26.582 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201680 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201740 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201800 with size: 0.000183 MiB 00:07:26.582 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:07:26.582 element at address: 0x200000201980 with size: 0.000183 MiB 00:07:26.583 element at address: 0x200000201a40 with size: 0.000183 MiB 00:07:26.583 element at address: 0x200000201b00 with size: 0.000183 MiB 00:07:26.583 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:07:26.583 element at address: 0x200000201c80 with size: 0.000183 MiB 00:07:26.583 element at address: 0x200000201d40 with size: 0.000183 MiB 00:07:26.583 element at address: 0x200000201e00 
with size: 0.000183 MiB
00:07:26.583 [several hundred small free-heap elements omitted: addresses 0x200000201ec0 through 0x200027e6ff00, each with size: 0.000183 MiB]
00:07:26.585 list of memzone associated elements. size: 602.308289 MiB
00:07:26.585 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:07:26.585 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:07:26.585 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:07:26.586 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:07:26.586 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:07:26.586 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1338355_0
00:07:26.586 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:07:26.586 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1338355_0
00:07:26.586 element at address: 0x200003fff380 with size: 48.003052 MiB
00:07:26.586 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1338355_0
00:07:26.586 element at address: 0x2000195be940 with size: 20.255554 MiB
00:07:26.586 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:07:26.586 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:07:26.586 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:07:26.586 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:07:26.586 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1338355
00:07:26.586 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:07:26.586 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1338355
00:07:26.586 element at address: 0x20000022c5c0 with size: 1.008118 MiB
00:07:26.586 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1338355
00:07:26.586 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:07:26.586 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:07:26.586 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:07:26.586 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:07:26.586 element at address:
0x2000070fde40 with size: 1.008118 MiB
00:07:26.586 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:07:26.586 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:07:26.586 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:07:26.586 element at address: 0x200003eff180 with size: 1.000488 MiB
00:07:26.586 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1338355
00:07:26.586 element at address: 0x200003affc00 with size: 1.000488 MiB
00:07:26.586 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1338355
00:07:26.586 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:07:26.586 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1338355
00:07:26.586 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:07:26.586 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1338355
00:07:26.586 element at address: 0x200003a7fa00 with size: 0.500488 MiB
00:07:26.586 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1338355
00:07:26.586 element at address: 0x20000b27dc40 with size: 0.500488 MiB
00:07:26.586 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:07:26.586 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:07:26.586 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:07:26.586 element at address: 0x20001947c600 with size: 0.250488 MiB
00:07:26.586 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:07:26.586 element at address: 0x20000020a840 with size: 0.125488 MiB
00:07:26.586 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1338355
00:07:26.586 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:07:26.586 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:07:26.586 element at address: 0x200027e66580 with size: 0.023743 MiB
00:07:26.586 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:07:26.586 element at address: 0x200000206580 with size: 0.016113 MiB
00:07:26.586 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1338355
00:07:26.586 element at address: 0x200027e6c6c0 with size: 0.002441 MiB
00:07:26.586 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:07:26.586 element at address: 0x2000003d5f80 with size: 0.001282 MiB
00:07:26.586 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:07:26.586 [48 QAT VF memzones omitted: 16 VFs each on devices 0000:3d, 0000:3f, and 0000:da (functions 01.0 through 02.7); each element 0.000427 MiB, memzone info 0.000305 MiB]
00:07:26.587 element at address: 0x2000003d6740 with size: 0.000305 MiB
00:07:26.587 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1
00:07:26.587 element at address: 0x20000022b880 with size: 0.000305 MiB
00:07:26.587 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1338355
00:07:26.587 element at address: 0x200000206380 with size: 0.000305 MiB
00:07:26.587 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1338355
00:07:26.587 element at address: 0x200027e6d180 with size: 0.000305 MiB
00:07:26.587 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:07:26.587 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:07:26.587 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:07:26.587 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:07:26.587 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:07:26.587 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:07:26.587 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:07:26.587 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:07:26.587 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:07:26.587 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:07:26.587 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:07:26.587 element at address: 0x2000003cb000 with size:
0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:26.587 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:07:26.587 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:26.587 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:26.587 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:07:26.587 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:26.587 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:26.587 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:07:26.587 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:26.587 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:26.587 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:07:26.587 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:26.587 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:26.587 
element at address: 0x2000003bc280 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:07:26.587 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:26.587 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:26.587 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:07:26.587 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:26.587 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:26.587 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:07:26.587 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:26.587 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:26.587 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:07:26.587 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:26.587 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:26.587 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_11 00:07:26.587 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:26.587 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:26.587 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:07:26.587 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:26.587 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:26.587 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:07:26.587 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:26.587 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:26.587 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:07:26.587 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:07:26.587 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:26.587 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:26.588 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:07:26.588 element at address: 0x20000039b600 with size: 
0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:26.588 element at address: 0x20000039b440 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:26.588 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:07:26.588 element at address: 0x200000397b40 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:26.588 element at address: 0x200000397980 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:26.588 element at address: 0x200000397700 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:07:26.588 element at address: 0x200000394080 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:26.588 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:26.588 element at address: 0x200000393c40 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:07:26.588 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:26.588 element at address: 0x200000390400 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:26.588 element at address: 0x200000390180 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:07:26.588 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 
00:07:26.588 element at address: 0x20000038c940 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:26.588 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:07:26.588 element at address: 0x200000389040 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:26.588 element at address: 0x200000388e80 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:26.588 element at address: 0x200000388c00 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:07:26.588 element at address: 0x200000385580 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:26.588 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:26.588 element at address: 0x200000385140 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:07:26.588 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:26.588 element at address: 0x200000381900 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:26.588 element at address: 0x200000381680 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:07:26.588 element at address: 0x20000037e000 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:26.588 element at address: 0x20000037de40 with size: 0.000244 MiB 00:07:26.588 associated memzone 
info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:26.588 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:07:26.588 element at address: 0x20000037a540 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:26.588 element at address: 0x20000037a380 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:26.588 element at address: 0x20000037a100 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:07:26.588 element at address: 0x200000376a80 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:26.588 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:26.588 element at address: 0x200000376640 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:07:26.588 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:26.588 element at address: 0x200000372e00 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:26.588 element at address: 0x200000372b80 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:07:26.588 element at address: 0x20000036f500 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:26.588 element at address: 0x20000036f340 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:26.588 element at address: 0x20000036f0c0 with 
size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:07:26.588 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:26.588 element at address: 0x20000036b880 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:26.588 element at address: 0x20000036b600 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:07:26.588 element at address: 0x200000367f80 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:26.588 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:26.588 element at address: 0x200000367b40 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:07:26.588 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:26.588 element at address: 0x200000364300 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:26.588 element at address: 0x200000364080 with size: 0.000244 MiB 00:07:26.588 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:07:26.588 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:07:26.588 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:26.588 19:43:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:26.588 19:43:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1338355 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 
1338355 ']' 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 1338355 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1338355 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1338355' 00:07:26.588 killing process with pid 1338355 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 1338355 00:07:26.588 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 1338355 00:07:27.158 00:07:27.158 real 0m1.762s 00:07:27.158 user 0m1.936s 00:07:27.158 sys 0m0.559s 00:07:27.158 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:27.158 19:43:18 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:27.158 ************************************ 00:07:27.158 END TEST dpdk_mem_utility 00:07:27.158 ************************************ 00:07:27.158 19:43:18 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:27.158 19:43:18 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:27.158 19:43:18 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:27.158 19:43:18 -- common/autotest_common.sh@10 -- # set +x 00:07:27.158 ************************************ 00:07:27.158 START TEST event 00:07:27.158 ************************************ 00:07:27.158 19:43:18 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:27.158 * Looking for test 
storage... 00:07:27.158 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:27.158 19:43:18 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:27.158 19:43:18 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:27.158 19:43:18 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:27.158 19:43:18 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:07:27.158 19:43:18 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:27.158 19:43:18 event -- common/autotest_common.sh@10 -- # set +x 00:07:27.158 ************************************ 00:07:27.158 START TEST event_perf 00:07:27.158 ************************************ 00:07:27.158 19:43:18 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:27.418 Running I/O for 1 seconds...[2024-07-24 19:43:18.761844] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:27.418 [2024-07-24 19:43:18.761914] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338636 ] 00:07:27.418 [2024-07-24 19:43:18.885234] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:27.418 [2024-07-24 19:43:18.991157] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.418 [2024-07-24 19:43:18.991257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:27.418 [2024-07-24 19:43:18.991258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.418 [2024-07-24 19:43:18.991216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:28.800 Running I/O for 1 seconds... 00:07:28.800 lcore 0: 104017 00:07:28.800 lcore 1: 104019 00:07:28.800 lcore 2: 104021 00:07:28.800 lcore 3: 104018 00:07:28.800 done. 
00:07:28.800 00:07:28.800 real 0m1.353s 00:07:28.800 user 0m4.213s 00:07:28.800 sys 0m0.128s 00:07:28.800 19:43:20 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:28.800 19:43:20 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:28.800 ************************************ 00:07:28.800 END TEST event_perf 00:07:28.800 ************************************ 00:07:28.800 19:43:20 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:28.800 19:43:20 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:28.800 19:43:20 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:28.800 19:43:20 event -- common/autotest_common.sh@10 -- # set +x 00:07:28.800 ************************************ 00:07:28.800 START TEST event_reactor 00:07:28.800 ************************************ 00:07:28.800 19:43:20 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:28.800 [2024-07-24 19:43:20.192619] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:28.800 [2024-07-24 19:43:20.192698] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338875 ] 00:07:28.800 [2024-07-24 19:43:20.324196] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.060 [2024-07-24 19:43:20.429875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.999 test_start 00:07:29.999 oneshot 00:07:29.999 tick 100 00:07:29.999 tick 100 00:07:29.999 tick 250 00:07:29.999 tick 100 00:07:29.999 tick 100 00:07:29.999 tick 250 00:07:29.999 tick 100 00:07:29.999 tick 500 00:07:29.999 tick 100 00:07:29.999 tick 100 00:07:29.999 tick 250 00:07:29.999 tick 100 00:07:29.999 tick 100 00:07:29.999 test_end 00:07:29.999 00:07:29.999 real 0m1.355s 00:07:29.999 user 0m1.206s 00:07:29.999 sys 0m0.143s 00:07:29.999 19:43:21 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:29.999 19:43:21 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:29.999 ************************************ 00:07:29.999 END TEST event_reactor 00:07:29.999 ************************************ 00:07:29.999 19:43:21 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:29.999 19:43:21 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:29.999 19:43:21 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:29.999 19:43:21 event -- common/autotest_common.sh@10 -- # set +x 00:07:30.259 ************************************ 00:07:30.259 START TEST event_reactor_perf 00:07:30.259 ************************************ 00:07:30.259 19:43:21 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 
00:07:30.259 [2024-07-24 19:43:21.636460] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:07:30.259 [2024-07-24 19:43:21.636528] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339111 ] 00:07:30.259 [2024-07-24 19:43:21.767267] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.519 [2024-07-24 19:43:21.873164] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.460 test_start 00:07:31.460 test_end 00:07:31.460 Performance: 328384 events per second 00:07:31.460 00:07:31.460 real 0m1.364s 00:07:31.460 user 0m1.217s 00:07:31.460 sys 0m0.140s 00:07:31.460 19:43:22 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:31.460 19:43:22 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:31.460 ************************************ 00:07:31.460 END TEST event_reactor_perf 00:07:31.460 ************************************ 00:07:31.460 19:43:23 event -- event/event.sh@49 -- # uname -s 00:07:31.460 19:43:23 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:31.460 19:43:23 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:31.460 19:43:23 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:31.460 19:43:23 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:31.460 19:43:23 event -- common/autotest_common.sh@10 -- # set +x 00:07:31.767 ************************************ 00:07:31.767 START TEST event_scheduler 00:07:31.767 ************************************ 00:07:31.767 19:43:23 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 
00:07:31.767 * Looking for test storage... 00:07:31.767 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:31.767 19:43:23 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:31.767 19:43:23 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1339378 00:07:31.767 19:43:23 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:31.767 19:43:23 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:31.767 19:43:23 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1339378 00:07:31.767 19:43:23 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 1339378 ']' 00:07:31.767 19:43:23 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.767 19:43:23 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:31.767 19:43:23 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.767 19:43:23 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:31.767 19:43:23 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:31.767 [2024-07-24 19:43:23.231657] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:31.767 [2024-07-24 19:43:23.231727] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339378 ] 00:07:32.026 [2024-07-24 19:43:23.428335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:32.285 [2024-07-24 19:43:23.621978] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.285 [2024-07-24 19:43:23.622065] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:32.285 [2024-07-24 19:43:23.622183] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:32.285 [2024-07-24 19:43:23.622194] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:07:32.855 19:43:24 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:32.855 [2024-07-24 19:43:24.181507] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:32.855 [2024-07-24 19:43:24.181565] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:32.855 [2024-07-24 19:43:24.181599] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:32.855 [2024-07-24 19:43:24.181625] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:32.855 [2024-07-24 19:43:24.181649] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:32.855 19:43:24 
event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.855 19:43:24 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:32.855 [2024-07-24 19:43:24.322214] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.855 19:43:24 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:32.855 19:43:24 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:32.855 ************************************ 00:07:32.855 START TEST scheduler_create_thread 00:07:32.855 ************************************ 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.855 2 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.855 3 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.855 4 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.855 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.855 5 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.856 6 00:07:32.856 19:43:24 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.856 7 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.856 8 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:32.856 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:32.856 9 00:07:33.115 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.115 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:33.115 19:43:24 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.115 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:33.115 10 00:07:33.115 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.115 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:33.115 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.115 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:33.116 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.116 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:33.116 19:43:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:33.116 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.116 19:43:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:34.055 19:43:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:34.055 19:43:25 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:34.055 19:43:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:34.055 19:43:25 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:35.436 19:43:26 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:35.436 19:43:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:35.436 19:43:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:35.436 19:43:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:35.436 19:43:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:36.375 19:43:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:36.375 00:07:36.375 real 0m3.386s 00:07:36.375 user 0m0.026s 00:07:36.375 sys 0m0.007s 00:07:36.375 19:43:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:36.375 19:43:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:36.375 ************************************ 00:07:36.375 END TEST scheduler_create_thread 00:07:36.375 ************************************ 00:07:36.375 19:43:27 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:36.375 19:43:27 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1339378 00:07:36.375 19:43:27 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 1339378 ']' 00:07:36.375 19:43:27 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 1339378 00:07:36.375 19:43:27 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:07:36.375 19:43:27 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:36.376 19:43:27 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1339378 00:07:36.376 19:43:27 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:36.376 19:43:27 
event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:36.376 19:43:27 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1339378' 00:07:36.376 killing process with pid 1339378 00:07:36.376 19:43:27 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 1339378 00:07:36.376 19:43:27 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 1339378 00:07:36.635 [2024-07-24 19:43:28.129990] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:07:37.205 00:07:37.205 real 0m5.440s 00:07:37.205 user 0m10.450s 00:07:37.205 sys 0m0.641s 00:07:37.205 19:43:28 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:37.205 19:43:28 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:37.205 ************************************ 00:07:37.205 END TEST event_scheduler 00:07:37.205 ************************************ 00:07:37.205 19:43:28 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:37.205 19:43:28 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:37.205 19:43:28 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:37.205 19:43:28 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:37.205 19:43:28 event -- common/autotest_common.sh@10 -- # set +x 00:07:37.205 ************************************ 00:07:37.205 START TEST app_repeat 00:07:37.205 ************************************ 00:07:37.205 19:43:28 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 
00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1340146 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1340146' 00:07:37.205 Process app_repeat pid: 1340146 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:37.205 spdk_app_start Round 0 00:07:37.205 19:43:28 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1340146 /var/tmp/spdk-nbd.sock 00:07:37.205 19:43:28 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1340146 ']' 00:07:37.205 19:43:28 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:37.205 19:43:28 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:37.205 19:43:28 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:37.205 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:37.205 19:43:28 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:37.205 19:43:28 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:37.205 [2024-07-24 19:43:28.638828] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:37.205 [2024-07-24 19:43:28.638899] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1340146 ] 00:07:37.205 [2024-07-24 19:43:28.770152] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:37.465 [2024-07-24 19:43:28.872972] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:37.465 [2024-07-24 19:43:28.872975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.034 19:43:29 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:38.034 19:43:29 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:38.034 19:43:29 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:38.604 Malloc0 00:07:38.604 19:43:30 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:38.863 Malloc1 00:07:38.863 19:43:30 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:38.863 19:43:30 
event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:38.863 19:43:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:38.864 19:43:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:38.864 19:43:30 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:39.123 /dev/nbd0 00:07:39.123 19:43:30 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:39.123 19:43:30 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:07:39.123 1+0 records in 00:07:39.123 1+0 records out 00:07:39.123 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266016 s, 15.4 MB/s 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:39.123 19:43:30 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:39.123 19:43:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:39.123 19:43:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:39.123 19:43:30 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:39.383 /dev/nbd1 00:07:39.384 19:43:30 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:39.384 19:43:30 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 
)) 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:39.384 1+0 records in 00:07:39.384 1+0 records out 00:07:39.384 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261604 s, 15.7 MB/s 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:39.384 19:43:30 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:39.384 19:43:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:39.384 19:43:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:39.384 19:43:30 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:39.384 19:43:30 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:39.384 19:43:30 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:39.645 19:43:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:39.645 { 00:07:39.645 "nbd_device": "/dev/nbd0", 00:07:39.645 "bdev_name": "Malloc0" 00:07:39.645 }, 00:07:39.645 { 00:07:39.645 "nbd_device": "/dev/nbd1", 00:07:39.645 "bdev_name": "Malloc1" 00:07:39.645 } 00:07:39.645 ]' 00:07:39.645 19:43:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:39.645 { 00:07:39.645 "nbd_device": "/dev/nbd0", 00:07:39.645 "bdev_name": "Malloc0" 00:07:39.645 }, 00:07:39.645 { 00:07:39.645 "nbd_device": 
"/dev/nbd1", 00:07:39.645 "bdev_name": "Malloc1" 00:07:39.645 } 00:07:39.645 ]' 00:07:39.645 19:43:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:39.905 /dev/nbd1' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:39.905 /dev/nbd1' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:39.905 256+0 records in 00:07:39.905 256+0 records out 00:07:39.905 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108314 s, 96.8 MB/s 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:39.905 256+0 records in 00:07:39.905 256+0 records out 00:07:39.905 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0301584 s, 34.8 MB/s 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:39.905 256+0 records in 00:07:39.905 256+0 records out 00:07:39.905 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0314645 s, 33.3 MB/s 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@85 -- # 
rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.905 19:43:31 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:40.165 19:43:31 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.166 19:43:31 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.166 19:43:31 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 
00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.425 19:43:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:40.685 19:43:32 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:40.685 19:43:32 event.app_repeat -- event/event.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:40.944 19:43:32 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:41.205 [2024-07-24 19:43:32.673043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:41.205 [2024-07-24 19:43:32.771829] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:41.205 [2024-07-24 19:43:32.771833] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.464 [2024-07-24 19:43:32.824105] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:41.464 [2024-07-24 19:43:32.824158] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:44.002 19:43:35 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:44.002 19:43:35 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:44.002 spdk_app_start Round 1 00:07:44.002 19:43:35 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1340146 /var/tmp/spdk-nbd.sock 00:07:44.002 19:43:35 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1340146 ']' 00:07:44.002 19:43:35 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:44.002 19:43:35 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:44.002 19:43:35 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:44.002 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:44.002 19:43:35 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:44.002 19:43:35 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:44.262 19:43:35 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:44.262 19:43:35 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:44.262 19:43:35 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:44.522 Malloc0 00:07:44.522 19:43:35 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:44.781 Malloc1 00:07:44.781 19:43:36 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:44.782 19:43:36 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:45.042 /dev/nbd0 00:07:45.042 19:43:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:45.042 19:43:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:45.042 1+0 records in 00:07:45.042 1+0 records out 00:07:45.042 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278747 s, 14.7 MB/s 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:45.042 19:43:36 event.app_repeat -- 
common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:45.042 19:43:36 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:45.042 19:43:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:45.042 19:43:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:45.042 19:43:36 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:45.301 /dev/nbd1 00:07:45.301 19:43:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:45.301 19:43:36 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:45.301 19:43:36 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:45.301 1+0 records in 00:07:45.301 1+0 records out 00:07:45.301 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219909 s, 18.6 MB/s 00:07:45.302 19:43:36 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:45.302 19:43:36 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:45.302 19:43:36 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:45.302 19:43:36 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:45.302 19:43:36 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:45.302 19:43:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:45.302 19:43:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:45.302 19:43:36 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:45.302 19:43:36 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:45.302 19:43:36 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:45.561 { 00:07:45.561 "nbd_device": "/dev/nbd0", 00:07:45.561 "bdev_name": "Malloc0" 00:07:45.561 }, 00:07:45.561 { 00:07:45.561 "nbd_device": "/dev/nbd1", 00:07:45.561 "bdev_name": "Malloc1" 00:07:45.561 } 00:07:45.561 ]' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:45.561 { 00:07:45.561 "nbd_device": "/dev/nbd0", 00:07:45.561 "bdev_name": "Malloc0" 00:07:45.561 }, 00:07:45.561 { 00:07:45.561 "nbd_device": "/dev/nbd1", 00:07:45.561 "bdev_name": "Malloc1" 00:07:45.561 } 00:07:45.561 ]' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:45.561 /dev/nbd1' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:45.561 19:43:37 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:45.561 /dev/nbd1' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:45.561 256+0 records in 00:07:45.561 256+0 records out 00:07:45.561 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103134 s, 102 MB/s 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:45.561 256+0 records in 00:07:45.561 256+0 records out 00:07:45.561 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0299642 s, 35.0 MB/s 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:45.561 256+0 records in 00:07:45.561 256+0 records out 00:07:45.561 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0248075 s, 42.3 MB/s 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:45.561 19:43:37 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:45.820 19:43:37 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:45.820 19:43:37 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:45.820 19:43:37 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:45.820 19:43:37 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:45.820 19:43:37 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:45.820 19:43:37 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:45.820 19:43:37 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:46.080 19:43:37 event.app_repeat -- bdev/nbd_common.sh@39 
-- # sleep 0.1 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.339 19:43:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:46.599 19:43:38 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:46.599 19:43:38 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:46.859 
19:43:38 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:47.119 [2024-07-24 19:43:38.557835] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:47.119 [2024-07-24 19:43:38.656089] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:47.119 [2024-07-24 19:43:38.656092] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.119 [2024-07-24 19:43:38.709665] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:47.119 [2024-07-24 19:43:38.709715] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:50.413 19:43:41 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:50.413 19:43:41 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:50.413 spdk_app_start Round 2 00:07:50.413 19:43:41 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1340146 /var/tmp/spdk-nbd.sock 00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1340146 ']' 00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:50.413 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:50.413 19:43:41 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:50.413 19:43:41 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:50.413 Malloc0 00:07:50.413 19:43:41 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:50.673 Malloc1 00:07:50.673 19:43:42 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:50.673 19:43:42 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:50.933 /dev/nbd0 00:07:50.933 19:43:42 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:50.933 19:43:42 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:50.933 1+0 records in 00:07:50.933 1+0 records out 00:07:50.933 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242245 s, 16.9 MB/s 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:50.933 19:43:42 event.app_repeat -- 
common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:50.933 19:43:42 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:50.933 19:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:50.933 19:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:50.933 19:43:42 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:51.224 /dev/nbd1 00:07:51.224 19:43:42 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:51.224 19:43:42 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:51.224 19:43:42 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:51.225 1+0 records in 00:07:51.225 1+0 records out 00:07:51.225 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294916 s, 13.9 MB/s 00:07:51.225 19:43:42 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:51.225 19:43:42 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:51.225 19:43:42 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:51.225 19:43:42 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:51.225 19:43:42 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:51.225 19:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.225 19:43:42 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:51.225 19:43:42 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:51.225 19:43:42 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:51.225 19:43:42 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:51.484 { 00:07:51.484 "nbd_device": "/dev/nbd0", 00:07:51.484 "bdev_name": "Malloc0" 00:07:51.484 }, 00:07:51.484 { 00:07:51.484 "nbd_device": "/dev/nbd1", 00:07:51.484 "bdev_name": "Malloc1" 00:07:51.484 } 00:07:51.484 ]' 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:51.484 { 00:07:51.484 "nbd_device": "/dev/nbd0", 00:07:51.484 "bdev_name": "Malloc0" 00:07:51.484 }, 00:07:51.484 { 00:07:51.484 "nbd_device": "/dev/nbd1", 00:07:51.484 "bdev_name": "Malloc1" 00:07:51.484 } 00:07:51.484 ]' 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:51.484 /dev/nbd1' 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:51.484 /dev/nbd1' 00:07:51.484 
19:43:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:51.484 19:43:42 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:51.484 256+0 records in 00:07:51.484 256+0 records out 00:07:51.484 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105308 s, 99.6 MB/s 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:51.484 256+0 records in 00:07:51.484 256+0 records out 00:07:51.484 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0301944 s, 34.7 MB/s 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:51.484 256+0 records in 00:07:51.484 256+0 records out 00:07:51.484 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0314658 s, 33.3 MB/s 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:51.484 19:43:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:51.485 19:43:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:51.744 19:43:43 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.006 19:43:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:52.267 19:43:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:52.267 19:43:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:52.267 19:43:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:52.573 19:43:43 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:52.573 19:43:43 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:52.833 19:43:44 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:52.833 [2024-07-24 19:43:44.417425] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:53.092 [2024-07-24 19:43:44.520599] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:53.092 [2024-07-24 19:43:44.520603] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:53.092 [2024-07-24 19:43:44.574259] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:53.092 [2024-07-24 19:43:44.574310] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:55.623 19:43:47 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1340146 /var/tmp/spdk-nbd.sock 00:07:55.623 19:43:47 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1340146 ']' 00:07:55.623 19:43:47 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:55.623 19:43:47 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:55.623 19:43:47 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:55.623 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:55.623 19:43:47 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:55.623 19:43:47 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:55.881 19:43:47 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:55.881 19:43:47 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:55.881 19:43:47 event.app_repeat -- event/event.sh@39 -- # killprocess 1340146 00:07:55.881 19:43:47 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 1340146 ']' 00:07:55.881 19:43:47 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 1340146 00:07:55.881 19:43:47 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:07:55.881 19:43:47 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:55.881 19:43:47 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1340146 00:07:56.140 19:43:47 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:56.140 19:43:47 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:56.140 19:43:47 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1340146' 00:07:56.140 killing process with pid 1340146 00:07:56.140 19:43:47 event.app_repeat -- common/autotest_common.sh@969 -- # kill 1340146 00:07:56.140 19:43:47 event.app_repeat -- common/autotest_common.sh@974 -- # wait 1340146 00:07:56.140 spdk_app_start is called in Round 0. 00:07:56.140 Shutdown signal received, stop current app iteration 00:07:56.140 Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 reinitialization... 00:07:56.140 spdk_app_start is called in Round 1. 00:07:56.140 Shutdown signal received, stop current app iteration 00:07:56.140 Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 reinitialization... 00:07:56.140 spdk_app_start is called in Round 2. 
00:07:56.140 Shutdown signal received, stop current app iteration 00:07:56.140 Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 reinitialization... 00:07:56.140 spdk_app_start is called in Round 3. 00:07:56.140 Shutdown signal received, stop current app iteration 00:07:56.140 19:43:47 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:56.141 19:43:47 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:56.141 00:07:56.141 real 0m19.111s 00:07:56.141 user 0m41.594s 00:07:56.141 sys 0m3.843s 00:07:56.141 19:43:47 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:56.141 19:43:47 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:56.141 ************************************ 00:07:56.141 END TEST app_repeat 00:07:56.141 ************************************ 00:07:56.400 19:43:47 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:56.400 00:07:56.400 real 0m29.168s 00:07:56.400 user 0m58.884s 00:07:56.400 sys 0m5.278s 00:07:56.400 19:43:47 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:56.400 19:43:47 event -- common/autotest_common.sh@10 -- # set +x 00:07:56.400 ************************************ 00:07:56.400 END TEST event 00:07:56.400 ************************************ 00:07:56.400 19:43:47 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:56.400 19:43:47 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:56.400 19:43:47 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:56.400 19:43:47 -- common/autotest_common.sh@10 -- # set +x 00:07:56.400 ************************************ 00:07:56.400 START TEST thread 00:07:56.400 ************************************ 00:07:56.400 19:43:47 thread -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:56.400 * Looking for test storage... 
00:07:56.400 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:56.400 19:43:47 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:56.400 19:43:47 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:56.400 19:43:47 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:56.400 19:43:47 thread -- common/autotest_common.sh@10 -- # set +x 00:07:56.400 ************************************ 00:07:56.400 START TEST thread_poller_perf 00:07:56.400 ************************************ 00:07:56.400 19:43:47 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:56.660 [2024-07-24 19:43:48.007095] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:07:56.660 [2024-07-24 19:43:48.007169] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1342851 ] 00:07:56.660 [2024-07-24 19:43:48.137550] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.660 [2024-07-24 19:43:48.237321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.660 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:58.040 ====================================== 00:07:58.040 busy:2314704214 (cyc) 00:07:58.040 total_run_count: 267000 00:07:58.040 tsc_hz: 2300000000 (cyc) 00:07:58.040 ====================================== 00:07:58.041 poller_cost: 8669 (cyc), 3769 (nsec) 00:07:58.041 00:07:58.041 real 0m1.360s 00:07:58.041 user 0m1.210s 00:07:58.041 sys 0m0.144s 00:07:58.041 19:43:49 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:58.041 19:43:49 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:58.041 ************************************ 00:07:58.041 END TEST thread_poller_perf 00:07:58.041 ************************************ 00:07:58.041 19:43:49 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:58.041 19:43:49 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:58.041 19:43:49 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:58.041 19:43:49 thread -- common/autotest_common.sh@10 -- # set +x 00:07:58.041 ************************************ 00:07:58.041 START TEST thread_poller_perf 00:07:58.041 ************************************ 00:07:58.041 19:43:49 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:58.041 [2024-07-24 19:43:49.448501] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:07:58.041 [2024-07-24 19:43:49.448568] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1343052 ] 00:07:58.041 [2024-07-24 19:43:49.582745] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.300 [2024-07-24 19:43:49.692786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.300 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:07:59.236 ====================================== 00:07:59.236 busy:2302314312 (cyc) 00:07:59.236 total_run_count: 3490000 00:07:59.236 tsc_hz: 2300000000 (cyc) 00:07:59.236 ====================================== 00:07:59.236 poller_cost: 659 (cyc), 286 (nsec) 00:07:59.236 00:07:59.236 real 0m1.369s 00:07:59.236 user 0m1.227s 00:07:59.236 sys 0m0.135s 00:07:59.236 19:43:50 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:59.236 19:43:50 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:59.236 ************************************ 00:07:59.236 END TEST thread_poller_perf 00:07:59.236 ************************************ 00:07:59.497 19:43:50 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:07:59.497 00:07:59.497 real 0m3.001s 00:07:59.497 user 0m2.534s 00:07:59.497 sys 0m0.477s 00:07:59.497 19:43:50 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:59.497 19:43:50 thread -- common/autotest_common.sh@10 -- # set +x 00:07:59.497 ************************************ 00:07:59.497 END TEST thread 00:07:59.497 ************************************ 00:07:59.497 19:43:50 -- spdk/autotest.sh@184 -- # [[ 1 -eq 1 ]] 00:07:59.497 19:43:50 -- spdk/autotest.sh@185 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:59.497 19:43:50 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 
']' 00:07:59.497 19:43:50 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:59.497 19:43:50 -- common/autotest_common.sh@10 -- # set +x 00:07:59.497 ************************************ 00:07:59.497 START TEST accel 00:07:59.497 ************************************ 00:07:59.497 19:43:50 accel -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:59.497 * Looking for test storage... 00:07:59.497 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:59.497 19:43:51 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:59.497 19:43:51 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:59.497 19:43:51 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:59.497 19:43:51 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1343290 00:07:59.497 19:43:51 accel -- accel/accel.sh@63 -- # waitforlisten 1343290 00:07:59.497 19:43:51 accel -- common/autotest_common.sh@831 -- # '[' -z 1343290 ']' 00:07:59.497 19:43:51 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:59.497 19:43:51 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:59.497 19:43:51 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:59.497 19:43:51 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:59.497 19:43:51 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:59.497 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:59.497 19:43:51 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:59.497 19:43:51 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:59.497 19:43:51 accel -- common/autotest_common.sh@10 -- # set +x 00:07:59.497 19:43:51 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:59.497 19:43:51 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:59.498 19:43:51 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:59.498 19:43:51 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:59.498 19:43:51 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:59.498 19:43:51 accel -- accel/accel.sh@41 -- # jq -r . 00:07:59.756 [2024-07-24 19:43:51.094517] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:07:59.756 [2024-07-24 19:43:51.094596] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1343290 ] 00:07:59.756 [2024-07-24 19:43:51.217737] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.756 [2024-07-24 19:43:51.316136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.692 19:43:51 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:00.692 19:43:51 accel -- common/autotest_common.sh@864 -- # return 0 00:08:00.692 19:43:51 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:00.692 19:43:51 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:00.692 19:43:51 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:00.692 19:43:51 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:08:00.692 19:43:51 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:00.692 19:43:51 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:00.692 19:43:51 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:00.692 19:43:51 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.692 19:43:51 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:00.692 19:43:51 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # 
IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # IFS== 00:08:00.692 19:43:52 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:00.692 19:43:52 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:00.692 19:43:52 accel -- accel/accel.sh@75 -- # killprocess 1343290 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@950 -- # '[' -z 1343290 ']' 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@954 -- # kill -0 1343290 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@955 -- # uname 
00:08:00.692 19:43:52 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1343290 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1343290' 00:08:00.692 killing process with pid 1343290 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@969 -- # kill 1343290 00:08:00.692 19:43:52 accel -- common/autotest_common.sh@974 -- # wait 1343290 00:08:00.951 19:43:52 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:00.951 19:43:52 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:08:00.951 19:43:52 accel -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:00.951 19:43:52 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:00.951 19:43:52 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.951 19:43:52 accel.accel_help -- common/autotest_common.sh@1125 -- # accel_perf -h 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:08:00.951 19:43:52 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:08:00.951 19:43:52 accel.accel_help -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:00.951 19:43:52 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:08:01.209 19:43:52 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:01.209 19:43:52 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:01.209 19:43:52 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:01.209 19:43:52 accel -- common/autotest_common.sh@10 -- # set +x 00:08:01.209 ************************************ 00:08:01.209 START TEST accel_missing_filename 00:08:01.209 ************************************ 00:08:01.209 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress 00:08:01.209 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # local es=0 00:08:01.209 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:01.209 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:01.209 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:01.209 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # type -t accel_perf 00:08:01.209 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:01.210 19:43:52 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:01.210 19:43:52 
accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:08:01.210 19:43:52 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:08:01.210 [2024-07-24 19:43:52.653266] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:01.210 [2024-07-24 19:43:52.653332] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1343589 ] 00:08:01.210 [2024-07-24 19:43:52.785015] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.468 [2024-07-24 19:43:52.891338] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.468 [2024-07-24 19:43:52.968384] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:01.468 [2024-07-24 19:43:53.041217] accel_perf.c:1540:main: *ERROR*: ERROR starting application 00:08:01.727 A filename is required. 
00:08:01.727 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # es=234 00:08:01.727 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:01.727 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # es=106 00:08:01.727 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@663 -- # case "$es" in 00:08:01.727 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@670 -- # es=1 00:08:01.727 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:01.727 00:08:01.727 real 0m0.520s 00:08:01.727 user 0m0.339s 00:08:01.728 sys 0m0.200s 00:08:01.728 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:01.728 19:43:53 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:08:01.728 ************************************ 00:08:01.728 END TEST accel_missing_filename 00:08:01.728 ************************************ 00:08:01.728 19:43:53 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.728 19:43:53 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:08:01.728 19:43:53 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:01.728 19:43:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:01.728 ************************************ 00:08:01.728 START TEST accel_compress_verify 00:08:01.728 ************************************ 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # local es=0 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # valid_exec_arg 
accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # type -t accel_perf 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:01.728 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:01.728 19:43:53 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:08:01.728 [2024-07-24 19:43:53.261540] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:01.728 [2024-07-24 19:43:53.261602] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1343696 ] 00:08:01.986 [2024-07-24 19:43:53.390867] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.986 [2024-07-24 19:43:53.498327] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.986 [2024-07-24 19:43:53.563973] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:02.245 [2024-07-24 19:43:53.637085] accel_perf.c:1540:main: *ERROR*: ERROR starting application 00:08:02.245 00:08:02.245 Compression does not support the verify option, aborting. 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # es=161 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # es=33 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@663 -- # case "$es" in 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@670 -- # es=1 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:02.245 00:08:02.245 real 0m0.507s 00:08:02.245 user 0m0.336s 00:08:02.245 sys 0m0.201s 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:02.245 19:43:53 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:08:02.245 ************************************ 00:08:02.245 END TEST accel_compress_verify 00:08:02.245 ************************************ 00:08:02.245 19:43:53 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:08:02.245 19:43:53 accel -- 
common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:02.245 19:43:53 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:02.245 19:43:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.245 ************************************ 00:08:02.245 START TEST accel_wrong_workload 00:08:02.245 ************************************ 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w foobar 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # local es=0 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # type -t accel_perf 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:02.245 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.245 19:43:53 
accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:08:02.245 19:43:53 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:08:02.505 Unsupported workload type: foobar 00:08:02.505 [2024-07-24 19:43:53.856488] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:02.505 accel_perf options: 00:08:02.505 [-h help message] 00:08:02.505 [-q queue depth per core] 00:08:02.505 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:02.505 [-T number of threads per core 00:08:02.505 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:02.505 [-t time in seconds] 00:08:02.505 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:02.505 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:02.505 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:02.505 [-l for compress/decompress workloads, name of uncompressed input file 00:08:02.505 [-S for crc32c workload, use this seed value (default 0) 00:08:02.505 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:02.505 [-f for fill workload, use this BYTE value (default 255) 00:08:02.505 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:02.505 [-y verify result if this switch is on] 00:08:02.505 [-a tasks to allocate per core (default: same value as -q)] 00:08:02.505 Can be used to spread operations across a wider range of memory. 
00:08:02.505 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # es=1 00:08:02.505 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:02.505 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:08:02.505 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:02.505 00:08:02.505 real 0m0.058s 00:08:02.505 user 0m0.074s 00:08:02.505 sys 0m0.027s 00:08:02.505 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:02.505 19:43:53 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:08:02.505 ************************************ 00:08:02.505 END TEST accel_wrong_workload 00:08:02.505 ************************************ 00:08:02.505 19:43:53 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:08:02.505 19:43:53 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:08:02.505 19:43:53 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:02.505 19:43:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.505 ************************************ 00:08:02.505 START TEST accel_negative_buffers 00:08:02.505 ************************************ 00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # local es=0 00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:02.505 19:43:53 accel.accel_negative_buffers -- 
common/autotest_common.sh@642 -- # type -t accel_perf 00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:08:02.505 19:43:53 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:08:02.505 -x option must be non-negative. 00:08:02.505 [2024-07-24 19:43:53.988146] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:08:02.505 accel_perf options: 00:08:02.505 [-h help message] 00:08:02.505 [-q queue depth per core] 00:08:02.505 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:02.505 [-T number of threads per core 00:08:02.505 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:08:02.505 [-t time in seconds] 00:08:02.505 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:02.505 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:02.505 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:02.505 [-l for compress/decompress workloads, name of uncompressed input file 00:08:02.505 [-S for crc32c workload, use this seed value (default 0) 00:08:02.505 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:02.505 [-f for fill workload, use this BYTE value (default 255) 00:08:02.505 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:02.505 [-y verify result if this switch is on] 00:08:02.505 [-a tasks to allocate per core (default: same value as -q)] 00:08:02.505 Can be used to spread operations across a wider range of memory. 
00:08:02.505 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # es=1 00:08:02.506 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:02.506 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:08:02.506 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:02.506 00:08:02.506 real 0m0.040s 00:08:02.506 user 0m0.022s 00:08:02.506 sys 0m0.018s 00:08:02.506 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:02.506 19:43:53 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:08:02.506 ************************************ 00:08:02.506 END TEST accel_negative_buffers 00:08:02.506 ************************************ 00:08:02.506 Error: writing output failed: Broken pipe 00:08:02.506 19:43:54 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:08:02.506 19:43:54 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:02.506 19:43:54 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:02.506 19:43:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.506 ************************************ 00:08:02.506 START TEST accel_crc32c 00:08:02.506 ************************************ 00:08:02.506 19:43:54 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -S 32 -y 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:02.506 19:43:54 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:02.765 [2024-07-24 19:43:54.111422] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:02.765 [2024-07-24 19:43:54.111489] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1343888 ] 00:08:02.765 [2024-07-24 19:43:54.241908] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.765 [2024-07-24 19:43:54.342788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read 
-r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 
accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val=Yes 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.024 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.025 19:43:54 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.404 19:43:55 
accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:04.404 19:43:55 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:04.404 00:08:04.404 real 0m1.518s 00:08:04.404 user 0m1.309s 00:08:04.404 sys 0m0.209s 00:08:04.404 19:43:55 accel.accel_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:04.404 19:43:55 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:04.404 ************************************ 00:08:04.404 END TEST accel_crc32c 00:08:04.404 ************************************ 00:08:04.404 19:43:55 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:08:04.404 19:43:55 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:04.404 19:43:55 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:04.404 19:43:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:04.404 ************************************ 00:08:04.404 START TEST accel_crc32c_C2 00:08:04.404 ************************************ 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- 
common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -y -C 2 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:04.404 19:43:55 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:04.404 [2024-07-24 19:43:55.714340] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:04.404 [2024-07-24 19:43:55.714408] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1344119 ] 00:08:04.404 [2024-07-24 19:43:55.844096] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.404 [2024-07-24 19:43:55.944478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.664 
19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.664 19:43:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:05.600 00:08:05.600 real 0m1.511s 00:08:05.600 user 0m1.314s 00:08:05.600 sys 0m0.203s 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:05.600 19:43:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:05.600 ************************************ 00:08:05.600 END TEST accel_crc32c_C2 00:08:05.600 ************************************ 00:08:05.859 19:43:57 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:08:05.859 19:43:57 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:05.859 19:43:57 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:05.859 19:43:57 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.859 ************************************ 00:08:05.859 START TEST accel_copy 00:08:05.859 ************************************ 00:08:05.859 19:43:57 accel.accel_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy -y 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w copy -y 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:05.859 19:43:57 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:08:05.859 [2024-07-24 19:43:57.308341] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:05.859 [2024-07-24 19:43:57.308414] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1344320 ] 00:08:05.859 [2024-07-24 19:43:57.419924] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.118 [2024-07-24 19:43:57.526950] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:08:06.118 19:43:57 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.118 19:43:57 
accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:06.118 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.119 19:43:57 
accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.119 19:43:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.496 19:43:58 accel.accel_copy -- 
accel/accel.sh@21 -- # case "$var" in 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:08:07.496 19:43:58 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:07.496 00:08:07.496 real 0m1.500s 00:08:07.496 user 0m1.319s 00:08:07.496 sys 0m0.182s 00:08:07.496 19:43:58 accel.accel_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:07.496 19:43:58 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:08:07.496 ************************************ 00:08:07.496 END TEST accel_copy 00:08:07.496 ************************************ 00:08:07.496 19:43:58 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.496 19:43:58 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:07.496 19:43:58 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:07.496 19:43:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:07.496 ************************************ 00:08:07.496 START TEST accel_fill 00:08:07.496 ************************************ 00:08:07.496 19:43:58 accel.accel_fill -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:08:07.496 19:43:58 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:08:07.496 [2024-07-24 19:43:58.890252] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:07.496 [2024-07-24 19:43:58.890319] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1344516 ] 00:08:07.496 [2024-07-24 19:43:59.023553] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.756 [2024-07-24 19:43:59.125761] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 
00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:07.756 
19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 
00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.756 19:43:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 
accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:08:09.134 19:44:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:09.134 00:08:09.134 real 0m1.512s 00:08:09.134 user 0m1.322s 00:08:09.134 sys 0m0.195s 00:08:09.134 19:44:00 accel.accel_fill -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:09.134 19:44:00 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:08:09.134 ************************************ 00:08:09.134 END TEST accel_fill 00:08:09.134 ************************************ 00:08:09.134 19:44:00 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:09.134 19:44:00 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:09.134 19:44:00 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:09.134 19:44:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:09.134 ************************************ 00:08:09.134 START TEST accel_copy_crc32c 00:08:09.134 ************************************ 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- 
accel/accel.sh@17 -- # local accel_module 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:09.134 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:09.134 [2024-07-24 19:44:00.487286] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
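The bulk of the xtrace above is accel.sh consuming accel_perf's dump of its own configuration: each line is a `key:value` pair that the harness splits with `IFS=:` and `read -r var val`, then dispatches on in a `case "$var"` statement (setting `accel_opc`, `accel_module`, and so on). A minimal self-contained sketch of that parsing loop, using made-up sample input in place of real accel_perf output:

```shell
# Sketch of the var/val loop visible in the accel.sh xtrace above.
# The sample keys and values here are illustrative stand-ins, not
# actual accel_perf output.
printf '%s\n' 'queue_depth:32' 'module:software' 'duration:1 seconds' |
while IFS=: read -r var val; do
    # IFS=: splits only at the first colon, so values containing
    # spaces (e.g. "1 seconds") survive intact in $val.
    case "$var" in
        queue_depth) echo "queue depth = $val" ;;  # -> queue depth = 32
        module)      echo "module = $val" ;;       # -> module = software
        *)           echo "$var = $val" ;;         # -> duration = 1 seconds
    esac
done
```

This is why every setting in the log appears as a `val=...` line followed by the same `case "$var" in` / `IFS=:` / `read -r var val` trio.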
00:08:09.134 [2024-07-24 19:44:00.487348] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1344714 ] 00:08:09.134 [2024-07-24 19:44:00.617669] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.134 [2024-07-24 19:44:00.719840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.394 19:44:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.398 19:44:01 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:10.398 00:08:10.398 real 0m1.518s 00:08:10.398 user 0m1.306s 00:08:10.398 sys 0m0.213s 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:10.398 19:44:01 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:10.398 ************************************ 00:08:10.398 END TEST accel_copy_crc32c 00:08:10.398 ************************************ 00:08:10.658 19:44:02 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:10.658 19:44:02 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:10.658 19:44:02 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:10.658 19:44:02 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.658 ************************************ 00:08:10.658 START TEST accel_copy_crc32c_C2 00:08:10.658 ************************************ 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- 
common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:10.658 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:10.658 [2024-07-24 19:44:02.087529] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
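Each test in this log is wrapped by the `run_test` helper (from SPDK's common/autotest_common.sh), which produces the `START TEST` / `END TEST` banners and the `real`/`user`/`sys` timing summary seen above. A simplified, hypothetical stand-in showing the same banner-and-timing shape (this is not the real helper, which also manages xtrace state and argument counting):

```shell
# Hypothetical, stripped-down sketch of the run_test pattern whose
# banners appear throughout this log; assumes bash for the `time`
# keyword that yields the real/user/sys lines.
run_test_sketch() {
    local name=$1; shift
    echo "START TEST $name"
    time "$@"          # timing summary goes to stderr, like the log's real/user/sys
    echo "END TEST $name"
}

run_test_sketch demo_ok true   # banners around a trivially passing command
```

The real helper additionally disables xtrace around bookkeeping (`xtrace_disable`, `set +x`), which is why each `END TEST` block in the log is bracketed by those calls.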
00:08:10.658 [2024-07-24 19:44:02.087594] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1344944 ] 00:08:10.658 [2024-07-24 19:44:02.203181] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.917 [2024-07-24 19:44:02.311041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:10.917 19:44:02 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:10.917 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:10.918 19:44:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.371 
19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.371 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:12.372 00:08:12.372 real 0m1.508s 00:08:12.372 user 0m1.322s 00:08:12.372 sys 0m0.190s 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:12.372 19:44:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:12.372 ************************************ 00:08:12.372 END TEST accel_copy_crc32c_C2 00:08:12.372 ************************************ 00:08:12.372 19:44:03 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:08:12.372 19:44:03 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:12.372 19:44:03 accel -- common/autotest_common.sh@1107 
-- # xtrace_disable 00:08:12.372 19:44:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:12.372 ************************************ 00:08:12.372 START TEST accel_dualcast 00:08:12.372 ************************************ 00:08:12.372 19:44:03 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dualcast -y 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:08:12.372 19:44:03 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:08:12.372 [2024-07-24 19:44:03.685559] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:12.372 [2024-07-24 19:44:03.685622] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1345260 ] 00:08:12.372 [2024-07-24 19:44:03.815473] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.372 [2024-07-24 19:44:03.912691] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.631 19:44:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:08:13.566 19:44:05 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:13.566 00:08:13.566 real 0m1.491s 00:08:13.566 user 0m1.302s 00:08:13.566 sys 0m0.192s 00:08:13.566 19:44:05 accel.accel_dualcast -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:13.566 19:44:05 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:08:13.566 ************************************ 00:08:13.566 END TEST accel_dualcast 00:08:13.566 ************************************ 00:08:13.824 19:44:05 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:08:13.824 19:44:05 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:13.824 19:44:05 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:13.824 19:44:05 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.824 ************************************ 00:08:13.824 START TEST accel_compare 00:08:13.824 ************************************ 00:08:13.824 19:44:05 accel.accel_compare -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compare -y 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.824 
19:44:05 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:08:13.824 19:44:05 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:08:13.824 [2024-07-24 19:44:05.256630] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:13.824 [2024-07-24 19:44:05.256696] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1345467 ] 00:08:13.824 [2024-07-24 19:44:05.388600] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.083 [2024-07-24 19:44:05.494802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.083 19:44:05 accel.accel_compare -- 
accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 
accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.083 19:44:05 accel.accel_compare -- 
accel/accel.sh@21 -- # case "$var" in 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.083 19:44:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 
00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:08:15.462 19:44:06 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:15.462 00:08:15.462 real 0m1.517s 00:08:15.462 user 0m1.314s 00:08:15.462 sys 0m0.208s 00:08:15.462 19:44:06 accel.accel_compare -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:15.462 19:44:06 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:08:15.462 ************************************ 00:08:15.462 END TEST accel_compare 00:08:15.462 ************************************ 00:08:15.462 19:44:06 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:15.462 19:44:06 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:15.462 19:44:06 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:15.462 19:44:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:15.462 ************************************ 00:08:15.462 START TEST accel_xor 00:08:15.462 ************************************ 00:08:15.462 19:44:06 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 
00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:15.462 19:44:06 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:15.462 [2024-07-24 19:44:06.854888] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:15.462 [2024-07-24 19:44:06.854948] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1345664 ] 00:08:15.462 [2024-07-24 19:44:06.983552] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.722 [2024-07-24 19:44:07.085689] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 
19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- 
accel/accel.sh@20 -- # val=software 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case 
"$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.722 19:44:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 19:44:08 accel.accel_xor -- 
accel/accel.sh@20 -- # val= 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:17.098 00:08:17.098 real 0m1.507s 00:08:17.098 user 0m1.324s 00:08:17.098 sys 0m0.187s 00:08:17.098 19:44:08 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:17.098 19:44:08 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:17.098 ************************************ 00:08:17.098 END TEST accel_xor 00:08:17.098 ************************************ 00:08:17.098 19:44:08 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:17.098 19:44:08 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:17.098 19:44:08 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:17.098 19:44:08 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.098 ************************************ 00:08:17.098 START TEST accel_xor 00:08:17.098 ************************************ 00:08:17.098 19:44:08 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y -x 3 00:08:17.098 19:44:08 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:17.099 19:44:08 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:17.099 [2024-07-24 19:44:08.443032] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:17.099 [2024-07-24 19:44:08.443098] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1345860 ] 00:08:17.099 [2024-07-24 19:44:08.572474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.099 [2024-07-24 19:44:08.675363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 19:44:08 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.359 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.359 19:44:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.359 19:44:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.359 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.359 19:44:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:18.733 19:44:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:18.733 00:08:18.733 real 0m1.512s 00:08:18.733 user 0m1.308s 00:08:18.733 sys 0m0.202s 00:08:18.733 19:44:09 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:18.733 19:44:09 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:18.733 ************************************ 00:08:18.733 END TEST accel_xor 00:08:18.733 ************************************ 00:08:18.733 19:44:09 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:18.733 19:44:09 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:08:18.733 19:44:09 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:18.733 19:44:09 accel -- common/autotest_common.sh@10 -- # set +x 00:08:18.733 ************************************ 00:08:18.733 START TEST accel_dif_verify 00:08:18.733 ************************************ 00:08:18.733 19:44:10 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_verify 00:08:18.733 19:44:10 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:08:18.733 19:44:10 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:08:18.733 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.733 19:44:10 
accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.733 19:44:10 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:18.733 19:44:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:18.734 19:44:10 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:08:18.734 [2024-07-24 19:44:10.034945] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:18.734 [2024-07-24 19:44:10.035011] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1346061 ] 00:08:18.734 [2024-07-24 19:44:10.163063] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.734 [2024-07-24 19:44:10.262780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 
19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # 
val= 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.993 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var 
val 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:18.994 19:44:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.929 19:44:11 
accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.929 19:44:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.930 19:44:11 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:19.930 19:44:11 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:19.930 19:44:11 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:19.930 00:08:19.930 real 0m1.493s 00:08:19.930 user 0m1.301s 00:08:19.930 sys 0m0.198s 00:08:19.930 19:44:11 accel.accel_dif_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:19.930 19:44:11 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:08:19.930 ************************************ 00:08:19.930 END TEST accel_dif_verify 00:08:19.930 ************************************ 00:08:20.189 19:44:11 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:20.189 19:44:11 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:08:20.189 19:44:11 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:20.189 19:44:11 accel -- 
common/autotest_common.sh@10 -- # set +x 00:08:20.189 ************************************ 00:08:20.189 START TEST accel_dif_generate 00:08:20.189 ************************************ 00:08:20.189 19:44:11 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:08:20.189 19:44:11 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:08:20.189 [2024-07-24 19:44:11.616283] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:20.189 [2024-07-24 19:44:11.616348] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1346332 ] 00:08:20.449 [2024-07-24 19:44:11.799219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.449 [2024-07-24 19:44:11.904003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- 
# case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r 
var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:20.449 19:44:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:20.449 19:44:11 
accel.accel_dif_generate -- accel/accel.sh@19-21 -- # [xtrace reading loop collapsed: val=No and trailing empty vals read with IFS=: between 00:08:20.449 and 00:08:21.825]
00:08:21.825 19:44:13 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:21.825 19:44:13 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]]
00:08:21.825 19:44:13 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:21.825
00:08:21.825 real	0m1.570s
00:08:21.825 user	0m1.332s
00:08:21.825 sys	0m0.244s
00:08:21.825 19:44:13 accel.accel_dif_generate -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:21.825 19:44:13 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x
00:08:21.825 ************************************
00:08:21.825 END TEST accel_dif_generate
00:08:21.825 ************************************
00:08:21.825 19:44:13 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy
00:08:21.825 19:44:13 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']'
00:08:21.825 19:44:13 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:21.825 19:44:13 accel -- common/autotest_common.sh@10 -- # set +x
00:08:21.825 ************************************
00:08:21.825 START TEST accel_dif_generate_copy
00:08:21.825 ************************************
00:08:21.825 19:44:13 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate_copy
00:08:21.825 19:44:13 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy
00:08:21.825 19:44:13 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
00:08:21.825 19:44:13 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config [xtrace collapsed: accel_json_cfg=(), no env-driven config, local IFS=, / jq -r .]
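The `-c /dev/fd/62` argument in the accel_perf invocation above comes from bash process substitution: the harness generates a JSON accel config and hands it to the benchmark on an anonymous file-descriptor path instead of a temp file. A minimal sketch of that pattern, with illustrative function names and config contents (not SPDK's actual `build_accel_config`):

```shell
# Hedged sketch: pass a generated config to a consumer over /dev/fd/N
# via process substitution, the way accel.sh feeds accel_perf's -c option.
build_cfg() {
  # Illustrative JSON payload; the real harness assembles module config
  # from the accel_json_cfg array and filters it through jq.
  printf '{"example": true}\n'
}

run_bench() {
  # When invoked as: run_bench <(build_cfg)
  # $1 is a path like /dev/fd/63 -- the same mechanism that yields
  # "-c /dev/fd/62" in the logged command line.
  local cfg_path=$1
  cat "$cfg_path"   # a real benchmark would parse this file as its config
}

run_bench <(build_cfg)
```

The advantage over a temp file is that nothing touches disk and there is nothing to clean up if the test is killed mid-run.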
00:08:21.825 [2024-07-24 19:44:13.269685] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:08:21.825 [2024-07-24 19:44:13.269748] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1346612 ]
00:08:21.825 [2024-07-24 19:44:13.398721] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:22.083 [2024-07-24 19:44:13.500677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:22.083 19:44:13 accel.accel_dif_generate_copy -- accel/accel.sh@19-23 -- # [xtrace config loop collapsed: val=0x1, val=dif_generate_copy (accel_opc), val='4096 bytes' (x2), val=software (accel_module), val=32 (x2), val=1, val='1 seconds', val=No, plus empty vals]
00:08:23.460 19:44:14 accel.accel_dif_generate_copy -- accel/accel.sh@19-21 -- # [xtrace reading loop collapsed: six trailing empty vals after the 1-second run]
00:08:23.460 19:44:14 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:23.460 19:44:14 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]]
00:08:23.460 19:44:14 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:23.460
00:08:23.460 real	0m1.511s
00:08:23.460 user	0m1.327s
00:08:23.460 sys	0m0.189s
00:08:23.460 19:44:14 accel.accel_dif_generate_copy -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:23.460 19:44:14 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x
00:08:23.460 ************************************
00:08:23.460 END TEST accel_dif_generate_copy
00:08:23.460 ************************************
00:08:23.460 19:44:14 accel -- accel/accel.sh@114 -- # run_test accel_dix_verify accel_test -t 1 -w dix_verify
00:08:23.460 19:44:14 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']'
00:08:23.461 19:44:14 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:23.461 19:44:14 accel -- common/autotest_common.sh@10 -- # set +x
00:08:23.461 ************************************
00:08:23.461 START TEST accel_dix_verify
00:08:23.461 ************************************
00:08:23.461 19:44:14 accel.accel_dix_verify -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dix_verify
00:08:23.461 19:44:14 accel.accel_dix_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dix_verify
00:08:23.461 19:44:14 accel.accel_dix_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dix_verify
00:08:23.461 19:44:14 accel.accel_dix_verify -- accel/accel.sh@12 -- # build_accel_config [xtrace collapsed: accel_json_cfg=(), no env-driven config, local IFS=, / jq -r .]
00:08:23.461 [2024-07-24 19:44:14.866694] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
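The long runs of `IFS=:` / `read -r var val` / `case "$var"` xtrace throughout this log are a single loop in accel.sh that parses accel_perf's colon-separated output to capture the engine module and opcode that the `[[ -n software ]]` / `[[ -n dix_verify ]]` checks later assert on. A hedged sketch of that parsing pattern (the key strings here are assumptions, not accel_perf's exact output):

```shell
# Sketch of a colon-separated key/value parse loop in the style of
# accel.sh's "while IFS=: read -r var val" plus case dispatch.
parse_output() {
  local var val accel_module='' accel_opc=''
  while IFS=: read -r var val; do
    case "$var" in
      "Module")        accel_module=${val# } ;;  # strip the space after ':'
      "Workload Type") accel_opc=${val# } ;;
      *) ;;                                      # ignore all other lines
    esac
  done
  # the harness effectively asserts: [[ -n $accel_module && -n $accel_opc ]]
  echo "$accel_module $accel_opc"
}

printf 'Module: software\nWorkload Type: dix_verify\nOther: x\n' | parse_output
# prints: software dix_verify
```

Because the loop runs under `set -x`, every `IFS=:`, `read`, and `case` iteration is traced, which is why a one-second benchmark produces hundreds of near-identical log lines.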
00:08:23.461 [2024-07-24 19:44:14.866758] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1346815 ]
00:08:23.461 [2024-07-24 19:44:14.985057] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:23.720 [2024-07-24 19:44:15.085731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:23.720 19:44:15 accel.accel_dix_verify -- accel/accel.sh@19-23 -- # [xtrace config loop collapsed: val=0x1, val=dix_verify (accel_opc), val='4096 bytes' (x2), val='512 bytes', val='8 bytes', val=software (accel_module), val=32 (x2), val=1, val='1 seconds', val=No, plus empty vals]
00:08:25.100 19:44:16 accel.accel_dix_verify -- accel/accel.sh@19-21 -- # [xtrace reading loop collapsed: six trailing empty vals after the 1-second run]
00:08:25.101 19:44:16 accel.accel_dix_verify -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:25.101 19:44:16 accel.accel_dix_verify -- accel/accel.sh@27 -- # [[ -n dix_verify ]]
00:08:25.101 19:44:16 accel.accel_dix_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:25.101
00:08:25.101 real	0m1.500s
00:08:25.101 user	0m1.315s
00:08:25.101 sys	0m0.190s
00:08:25.101 19:44:16 accel.accel_dix_verify -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:25.101 19:44:16 accel.accel_dix_verify -- common/autotest_common.sh@10 -- # set +x
00:08:25.101 ************************************
00:08:25.101 END TEST accel_dix_verify
00:08:25.101 ************************************
00:08:25.101 19:44:16 accel -- accel/accel.sh@115 -- # run_test accel_dix_generate accel_test -t 1 -w dif_generate
00:08:25.101 19:44:16 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']'
00:08:25.101 19:44:16 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:25.101 19:44:16 accel -- common/autotest_common.sh@10 -- # set +x
00:08:25.101 ************************************
00:08:25.101 START TEST accel_dix_generate
00:08:25.101 ************************************
00:08:25.101 19:44:16 accel.accel_dix_generate -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate
00:08:25.101 19:44:16 accel.accel_dix_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate
00:08:25.101 19:44:16 accel.accel_dix_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:08:25.101 19:44:16 accel.accel_dix_generate -- accel/accel.sh@12 -- # build_accel_config [xtrace collapsed: accel_json_cfg=(), no env-driven config, local IFS=, / jq -r .]
00:08:25.101 [2024-07-24 19:44:16.450434] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
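Each `START TEST` / `END TEST` banner pair in this log is printed by the `run_test` helper in common/autotest_common.sh, which names the test, times the wrapped command (producing the `real`/`user`/`sys` blocks), and propagates its exit status. A simplified sketch of that wrapper shape; the real helper also manages xtrace state and failure bookkeeping, which this omits:

```shell
# Minimal illustration of a run_test-style wrapper: banner, timed command,
# banner, preserved exit status. Not the real common/autotest_common.sh helper.
run_test() {
  local name=$1; shift
  echo "************************************"
  echo "START TEST $name"
  echo "************************************"
  time "$@"          # timing goes to stderr as real/user/sys lines
  local rc=$?
  echo "************************************"
  echo "END TEST $name"
  echo "************************************"
  return $rc
}

run_test demo_true true
```

Keeping the return code intact lets `catchError` further up the pipeline fail the Jenkins stage as soon as any wrapped test exits nonzero.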
00:08:25.101 [2024-07-24 19:44:16.450495] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347012 ]
00:08:25.101 [2024-07-24 19:44:16.579625] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:25.361 [2024-07-24 19:44:16.676398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:25.361 19:44:16 accel.accel_dix_generate -- accel/accel.sh@19-23 -- # [xtrace config loop collapsed: val=0x1, val=dif_generate (accel_opc), val='4096 bytes' (x2), val='512 bytes', val='8 bytes', val=software (accel_module), val=32 (x2), val=1, val='1 seconds', val=No, plus empty vals]
00:08:26.741 19:44:17 accel.accel_dix_generate -- accel/accel.sh@19-21 -- # [xtrace reading loop collapsed: six trailing empty vals after the 1-second run]
00:08:26.741 19:44:17 accel.accel_dix_generate -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:26.741 19:44:17 accel.accel_dix_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]]
00:08:26.741 19:44:17 accel.accel_dix_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:26.741
00:08:26.741 real	0m1.487s
00:08:26.741 user	0m1.315s
00:08:26.741 sys	0m0.179s
00:08:26.741 19:44:17 accel.accel_dix_generate -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:26.741 19:44:17 accel.accel_dix_generate -- common/autotest_common.sh@10 -- # set +x
00:08:26.741 ************************************
00:08:26.741 END TEST accel_dix_generate
00:08:26.741 ************************************
00:08:26.741 19:44:17 accel -- accel/accel.sh@117 -- # [[ y == y ]]
00:08:26.741 19:44:17 accel -- accel/accel.sh@118 -- #
run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:26.741 19:44:17 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:08:26.741 19:44:17 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:26.741 19:44:17 accel -- common/autotest_common.sh@10 -- # set +x 00:08:26.741 ************************************ 00:08:26.741 START TEST accel_comp 00:08:26.741 ************************************ 00:08:26.741 19:44:17 accel.accel_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:26.741 19:44:17 accel.accel_comp -- accel/accel.sh@41 -- # jq -r 
. 00:08:26.741 [2024-07-24 19:44:18.019778] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:26.741 [2024-07-24 19:44:18.019841] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347213 ] 00:08:26.741 [2024-07-24 19:44:18.146610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.741 [2024-07-24 19:44:18.246925] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.741 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.741 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.741 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 
00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 
00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 
19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.742 19:44:18 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:28.121 19:44:19 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:28.121 19:44:19 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:28.121 00:08:28.121 real 0m1.498s 00:08:28.121 user 0m1.315s 00:08:28.121 sys 0m0.182s 00:08:28.121 19:44:19 accel.accel_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:28.121 19:44:19 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:28.121 ************************************ 00:08:28.121 END TEST accel_comp 00:08:28.121 ************************************ 00:08:28.121 19:44:19 accel -- accel/accel.sh@119 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:28.121 19:44:19 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:28.121 19:44:19 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:28.121 19:44:19 accel -- common/autotest_common.sh@10 -- # set +x 00:08:28.121 ************************************ 00:08:28.121 START TEST accel_decomp 00:08:28.121 ************************************ 00:08:28.121 19:44:19 accel.accel_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w 
decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:28.121 19:44:19 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:28.121 [2024-07-24 19:44:19.610930] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:28.121 [2024-07-24 19:44:19.611060] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347406 ] 00:08:28.381 [2024-07-24 19:44:19.809767] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.381 [2024-07-24 19:44:19.915176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 
19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:28.641 19:44:19 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:20 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:20 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.641 19:44:20 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.641 19:44:20 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.641 19:44:20 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.641 19:44:20 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:29.579 19:44:21 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:29.579 19:44:21 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:29.579 00:08:29.579 real 0m1.601s 00:08:29.579 user 0m1.321s 00:08:29.579 sys 0m0.270s 00:08:29.579 19:44:21 accel.accel_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:29.579 19:44:21 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:29.579 ************************************ 00:08:29.579 END TEST accel_decomp 00:08:29.579 ************************************ 00:08:29.839 19:44:21 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:29.839 19:44:21 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:29.839 19:44:21 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:29.839 19:44:21 accel -- common/autotest_common.sh@10 -- # set +x 00:08:29.839 ************************************ 00:08:29.839 START TEST accel_decomp_full 00:08:29.839 ************************************ 00:08:29.839 19:44:21 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:29.839 
19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:29.839 19:44:21 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:29.839 [2024-07-24 19:44:21.284597] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:29.839 [2024-07-24 19:44:21.284661] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347730 ] 00:08:29.839 [2024-07-24 19:44:21.413268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.098 [2024-07-24 19:44:21.513328] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.098 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.098 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.098 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.098 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.098 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.098 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.098 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:30.099 19:44:21 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:31.478 19:44:22 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:31.478 19:44:22 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:31.478 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:31.478 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:31.478 19:44:22 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:31.478 19:44:22 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:31.478 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:31.479 19:44:22 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:31.479 00:08:31.479 real 0m1.516s 00:08:31.479 user 0m1.316s 00:08:31.479 sys 0m0.198s 00:08:31.479 19:44:22 accel.accel_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:31.479 19:44:22 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:31.479 ************************************ 00:08:31.479 END TEST accel_decomp_full 00:08:31.479 ************************************ 00:08:31.479 19:44:22 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:31.479 19:44:22 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:31.479 19:44:22 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:31.479 19:44:22 accel -- common/autotest_common.sh@10 -- # set +x 00:08:31.479 ************************************ 00:08:31.479 START TEST accel_decomp_mcore 
00:08:31.479 ************************************ 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:31.479 19:44:22 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:31.479 [2024-07-24 19:44:22.887088] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:31.479 [2024-07-24 19:44:22.887148] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1347962 ] 00:08:31.479 [2024-07-24 19:44:23.017466] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:31.738 [2024-07-24 19:44:23.119294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:31.738 [2024-07-24 19:44:23.119416] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:31.738 [2024-07-24 19:44:23.119481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:31.738 [2024-07-24 19:44:23.119483] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.738 19:44:23 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:33.116 00:08:33.116 real 0m1.519s 00:08:33.116 user 0m4.751s 00:08:33.116 sys 0m0.212s 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:33.116 19:44:24 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:33.116 ************************************ 00:08:33.116 END TEST accel_decomp_mcore 00:08:33.116 ************************************ 00:08:33.116 19:44:24 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:33.116 19:44:24 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:33.116 19:44:24 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:33.116 19:44:24 accel -- common/autotest_common.sh@10 -- # set +x 00:08:33.116 ************************************ 00:08:33.116 START TEST accel_decomp_full_mcore 00:08:33.116 ************************************ 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.116 19:44:24 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:33.116 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:33.116 [2024-07-24 19:44:24.489839] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:33.116 [2024-07-24 19:44:24.489902] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1348162 ] 00:08:33.116 [2024-07-24 19:44:24.620781] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:33.376 [2024-07-24 19:44:24.724720] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:33.376 [2024-07-24 19:44:24.724821] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:33.376 [2024-07-24 19:44:24.724921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:33.376 [2024-07-24 19:44:24.724922] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:33.376 19:44:24 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.376 19:44:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:34.755 00:08:34.755 real 0m1.543s 00:08:34.755 user 0m4.832s 00:08:34.755 sys 0m0.225s 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:34.755 19:44:25 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:34.755 ************************************ 00:08:34.755 END TEST accel_decomp_full_mcore 00:08:34.755 ************************************ 00:08:34.755 19:44:26 accel -- accel/accel.sh@123 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:34.755 19:44:26 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:34.755 19:44:26 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:34.755 19:44:26 accel -- common/autotest_common.sh@10 -- # set +x 00:08:34.755 ************************************ 00:08:34.755 START TEST accel_decomp_mthread 
00:08:34.755 ************************************ 00:08:34.755 19:44:26 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:34.755 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:34.755 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:34.755 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:34.756 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:34.756 [2024-07-24 19:44:26.114537] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:34.756 [2024-07-24 19:44:26.114597] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1348366 ] 00:08:34.756 [2024-07-24 19:44:26.243547] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.756 [2024-07-24 19:44:26.343930] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.015 19:44:26 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.015 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.016 
19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.016 19:44:26 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.393 19:44:27 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:36.393 00:08:36.393 real 0m1.516s 00:08:36.393 user 0m1.313s 00:08:36.393 sys 0m0.210s 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:36.393 19:44:27 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:08:36.394 ************************************ 00:08:36.394 END TEST accel_decomp_mthread 00:08:36.394 ************************************ 00:08:36.394 19:44:27 accel -- accel/accel.sh@124 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:36.394 19:44:27 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:36.394 19:44:27 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:36.394 19:44:27 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.394 ************************************ 00:08:36.394 START TEST accel_decomp_full_mthread 00:08:36.394 ************************************ 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:36.394 19:44:27 
accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:36.394 19:44:27 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:36.394 [2024-07-24 19:44:27.707482] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:36.394 [2024-07-24 19:44:27.707544] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1348559 ] 00:08:36.394 [2024-07-24 19:44:27.836590] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.394 [2024-07-24 19:44:27.937278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.653 19:44:28 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # 
case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.653 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.654 19:44:28 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read 
-r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:38.031 00:08:38.031 real 0m1.549s 00:08:38.031 user 0m1.347s 00:08:38.031 sys 0m0.206s 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:38.031 19:44:29 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:38.031 ************************************ 00:08:38.031 END TEST accel_decomp_full_mthread 00:08:38.031 ************************************ 00:08:38.031 19:44:29 accel -- accel/accel.sh@126 -- # [[ y == y ]] 00:08:38.031 19:44:29 accel -- accel/accel.sh@127 -- # COMPRESSDEV=1 00:08:38.031 19:44:29 accel -- accel/accel.sh@128 -- # get_expected_opcs 00:08:38.031 19:44:29 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:38.031 19:44:29 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1348796 00:08:38.031 19:44:29 accel -- accel/accel.sh@63 -- # waitforlisten 1348796 00:08:38.031 19:44:29 accel -- common/autotest_common.sh@831 -- # '[' -z 1348796 ']' 00:08:38.031 19:44:29 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:38.031 19:44:29 accel -- 
accel/accel.sh@61 -- # build_accel_config 00:08:38.031 19:44:29 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:38.031 19:44:29 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:38.031 19:44:29 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:38.031 19:44:29 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:38.031 19:44:29 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:38.031 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:38.031 19:44:29 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:38.031 19:44:29 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:38.031 19:44:29 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:38.031 19:44:29 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:38.031 19:44:29 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:38.031 19:44:29 accel -- common/autotest_common.sh@10 -- # set +x 00:08:38.031 19:44:29 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:38.031 19:44:29 accel -- accel/accel.sh@41 -- # jq -r . 00:08:38.031 [2024-07-24 19:44:29.337696] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:38.031 [2024-07-24 19:44:29.337765] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1348796 ] 00:08:38.031 [2024-07-24 19:44:29.465620] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.031 [2024-07-24 19:44:29.577514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.969 [2024-07-24 19:44:30.346236] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:38.969 19:44:30 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:38.969 19:44:30 accel -- common/autotest_common.sh@864 -- # return 0 00:08:38.969 19:44:30 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:38.969 19:44:30 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:38.969 19:44:30 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:38.969 19:44:30 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:38.969 19:44:30 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:38.969 19:44:30 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:38.969 19:44:30 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:38.969 19:44:30 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:38.969 19:44:30 accel -- common/autotest_common.sh@10 -- # set +x 00:08:38.969 19:44:30 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:39.229 "method": "compressdev_scan_accel_module", 00:08:39.229 19:44:30 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:39.229 19:44:30 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:39.229 19:44:30 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@10 -- # set +x 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- 
accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # IFS== 00:08:39.229 19:44:30 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:39.229 19:44:30 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:39.229 19:44:30 accel -- accel/accel.sh@75 -- # killprocess 1348796 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@950 -- # '[' -z 1348796 ']' 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@954 -- # kill -0 1348796 00:08:39.229 19:44:30 accel -- 
common/autotest_common.sh@955 -- # uname 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:39.229 19:44:30 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1348796 00:08:39.488 19:44:30 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:39.488 19:44:30 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:39.488 19:44:30 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1348796' 00:08:39.488 killing process with pid 1348796 00:08:39.488 19:44:30 accel -- common/autotest_common.sh@969 -- # kill 1348796 00:08:39.488 19:44:30 accel -- common/autotest_common.sh@974 -- # wait 1348796 00:08:39.748 19:44:31 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:39.748 19:44:31 accel -- accel/accel.sh@129 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:39.748 19:44:31 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:08:39.748 19:44:31 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:39.748 19:44:31 accel -- common/autotest_common.sh@10 -- # set +x 00:08:39.748 ************************************ 00:08:39.748 START TEST accel_cdev_comp 00:08:39.748 ************************************ 00:08:39.748 19:44:31 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:39.748 19:44:31 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:39.748 [2024-07-24 19:44:31.292536] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
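The killprocess helper traced just above (common/autotest_common.sh @950-@974) follows a recognizable guard-then-kill pattern. A minimal sketch of that pattern, assuming names from the trace (the exact SPDK implementation may differ):

```shell
# Sketch of the killprocess pattern seen in the trace: verify the pid is
# still alive with kill -0, look up its command name, refuse to kill a
# sudo wrapper, then kill it and reap it with wait.
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1
    kill -0 "$pid" 2> /dev/null || return 1        # pid still alive?
    local process_name
    process_name=$(ps --no-headers -o comm= "$pid")
    [ "$process_name" != sudo ] || return 1        # never kill the sudo wrapper itself
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" 2> /dev/null || true               # reap it so the pid is gone
}

# demo: start a throwaway background process and kill it
sleep 30 &
killprocess $!
```

`wait` only works on children of the same shell, which is why the harness runs killprocess in-process rather than from a subshell.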
00:08:39.748 [2024-07-24 19:44:31.292605] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1349117 ] 00:08:40.007 [2024-07-24 19:44:31.423114] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.007 [2024-07-24 19:44:31.528043] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.945 [2024-07-24 19:44:32.294266] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:40.945 [2024-07-24 19:44:32.296891] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16630e0 PMD being used: compress_qat 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 [2024-07-24 19:44:32.301017] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1867e70 PMD being used: compress_qat 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:40.945 19:44:32 
accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:40.945 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 
00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.946 19:44:32 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 
-- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:42.325 19:44:33 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:42.325 00:08:42.325 real 0m2.226s 00:08:42.325 user 0m0.024s 00:08:42.325 sys 0m0.007s 00:08:42.325 19:44:33 accel.accel_cdev_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:42.325 19:44:33 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:42.325 ************************************ 00:08:42.325 END TEST accel_cdev_comp 00:08:42.325 ************************************ 00:08:42.325 19:44:33 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:42.325 19:44:33 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:42.325 
19:44:33 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:42.325 19:44:33 accel -- common/autotest_common.sh@10 -- # set +x 00:08:42.325 ************************************ 00:08:42.325 START TEST accel_cdev_decomp 00:08:42.325 ************************************ 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 
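The build_accel_config steps traced above (accel.sh @31-@41) collect JSON method entries in the accel_json_cfg array and rely on `local IFS=,` so that `"${accel_json_cfg[*]}"` expands comma-joined, ready to embed in a larger config document before the `jq -r .` pretty-print. A minimal sketch of just the join, with a hypothetical second entry added to make the comma visible:

```shell
# Sketch of the accel_json_cfg join: [*] expansion uses the first
# character of IFS as the separator, so localizing IFS to ',' turns the
# array into a valid JSON-array body.
accel_json_cfg=()
accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
accel_json_cfg+=('{"method": "dsa_scan_accel_module"}')   # hypothetical second entry

join_accel_cfg() {
    local IFS=,
    printf '%s\n' "${accel_json_cfg[*]}"
}
join_accel_cfg
```

Note that `[*]` (not `[@]`) is what makes IFS-joining work; `"${accel_json_cfg[@]}"` would expand to separate words regardless of IFS.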
00:08:42.325 19:44:33 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:42.325 [2024-07-24 19:44:33.606723] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:42.325 [2024-07-24 19:44:33.606788] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1349450 ] 00:08:42.325 [2024-07-24 19:44:33.736072] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.325 [2024-07-24 19:44:33.832410] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.264 [2024-07-24 19:44:34.598548] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:43.264 [2024-07-24 19:44:34.601189] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfe10e0 PMD being used: compress_qat 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 [2024-07-24 19:44:34.605463] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11e5e70 PMD being used: compress_qat 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:43.264 19:44:34 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:44.202 19:44:35 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:44.202 00:08:44.202 real 0m2.218s 00:08:44.202 user 0m1.644s 00:08:44.202 sys 0m0.579s 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:44.202 19:44:35 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 
00:08:44.202 ************************************ 00:08:44.202 END TEST accel_cdev_decomp 00:08:44.202 ************************************ 00:08:44.461 19:44:35 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:44.461 19:44:35 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:44.461 19:44:35 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:44.461 19:44:35 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.461 ************************************ 00:08:44.461 START TEST accel_cdev_decomp_full 00:08:44.461 ************************************ 00:08:44.461 19:44:35 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:44.461 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:44.461 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:44.461 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:44.461 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 
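The expected_opcs loop traced at the top of this section (accel.sh @71-@73) repeats one idiom per opcode: each entry of exp_opcs has the form "opcode=module", and `IFS== read -r opc module` splits it at the first '=' into the opcode and the module expected to service it. A minimal sketch, with illustrative opcode names (the real list comes from the harness):

```shell
# Sketch of the opcode-table parse: IFS is set to '=' only for the read
# builtin, so each "opc=module" pair splits into its two fields, which
# are then recorded in an associative array.
declare -A expected_opcs
exp_opcs=(copy=software compress=dpdk_compressdev decompress=dpdk_compressdev)
for opc_opt in "${exp_opcs[@]}"; do
    IFS== read -r opc module <<< "$opc_opt"
    expected_opcs["$opc"]=$module
done
printf '%s -> %s\n' compress "${expected_opcs[compress]}"
```

Prefixing the assignment (`IFS== read ...`) scopes the IFS change to that one command, which is why the surrounding script's default word splitting is unaffected.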
00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:44.462 19:44:35 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:44.462 [2024-07-24 19:44:35.908411] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:08:44.462 [2024-07-24 19:44:35.908474] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1349683 ] 00:08:44.462 [2024-07-24 19:44:36.038141] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.721 [2024-07-24 19:44:36.142668] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.658 [2024-07-24 19:44:36.905661] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:45.658 [2024-07-24 19:44:36.908238] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24dc0e0 PMD being used: compress_qat 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.658 [2024-07-24 19:44:36.911625] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24df3b0 PMD being used: compress_qat 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 
accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:45.658 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:45.659 
19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:45.659 19:44:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:46.688 00:08:46.688 real 0m2.221s 00:08:46.688 user 0m1.650s 00:08:46.688 sys 0m0.575s 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:46.688 19:44:38 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:46.688 ************************************ 00:08:46.688 END TEST accel_cdev_decomp_full 00:08:46.688 ************************************ 00:08:46.688 19:44:38 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:46.688 19:44:38 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:46.688 19:44:38 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:46.688 19:44:38 accel -- common/autotest_common.sh@10 -- # set +x 00:08:46.688 ************************************ 00:08:46.688 START TEST accel_cdev_decomp_mcore 00:08:46.688 ************************************ 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 
00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:46.688 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:46.689 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:46.689 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:46.689 19:44:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:46.689 [2024-07-24 19:44:38.210757] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:46.689 [2024-07-24 19:44:38.210823] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350055 ] 00:08:46.948 [2024-07-24 19:44:38.341725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:46.948 [2024-07-24 19:44:38.449983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:46.948 [2024-07-24 19:44:38.450083] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:46.948 [2024-07-24 19:44:38.450184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:46.948 [2024-07-24 19:44:38.450185] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.885 [2024-07-24 19:44:39.209812] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:47.885 [2024-07-24 19:44:39.212367] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfec700 PMD being used: compress_qat 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 
19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:47.885 [2024-07-24 19:44:39.218163] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fdba419b8f0 PMD being used: compress_qat 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 [2024-07-24 19:44:39.219691] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xff19f0 PMD being used: compress_qat 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.885 [2024-07-24 19:44:39.223364] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fdb9c19b8f0 PMD being used: compress_qat 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 [2024-07-24 19:44:39.223585] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fdb9419b8f0 PMD being used: compress_qat 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:47.885 19:44:39 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:08:47.885 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.886 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.886 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.886 19:44:39 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.822 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:48.822 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.822 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.822 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.822 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 
19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:49.081 00:08:49.081 real 0m2.248s 00:08:49.081 user 0m7.223s 00:08:49.081 
sys 0m0.612s 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:49.081 19:44:40 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:49.081 ************************************ 00:08:49.081 END TEST accel_cdev_decomp_mcore 00:08:49.081 ************************************ 00:08:49.081 19:44:40 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:49.081 19:44:40 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:49.081 19:44:40 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:49.081 19:44:40 accel -- common/autotest_common.sh@10 -- # set +x 00:08:49.081 ************************************ 00:08:49.081 START TEST accel_cdev_decomp_full_mcore 00:08:49.081 ************************************ 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:49.081 19:44:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:49.081 [2024-07-24 19:44:40.541542] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:49.081 [2024-07-24 19:44:40.541608] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350424 ] 00:08:49.081 [2024-07-24 19:44:40.670155] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:49.340 [2024-07-24 19:44:40.779931] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:49.340 [2024-07-24 19:44:40.780032] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:49.340 [2024-07-24 19:44:40.780134] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:49.340 [2024-07-24 19:44:40.780135] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.277 [2024-07-24 19:44:41.535837] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:50.277 [2024-07-24 19:44:41.538460] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10e2700 PMD being used: compress_qat 00:08:50.277 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:50.277 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.277 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:50.278 [2024-07-24 19:44:41.543578] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9e7419b8f0 PMD being used: compress_qat 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:50.278 [2024-07-24 19:44:41.545443] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10e5a30 PMD being used: compress_qat 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 [2024-07-24 19:44:41.549043] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9e6c19b8f0 PMD being used: compress_qat 00:08:50.278 [2024-07-24 19:44:41.549305] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9e6419b8f0 PMD being used: compress_qat 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=:
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds'
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes
00:08:50.278 19:44:41 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:51.215 19:44:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:08:51.215 19:44:42
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:51.215 19:44:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:51.215 19:44:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:08:51.215 19:44:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:51.215 19:44:42 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:08:51.215
00:08:51.215 real 0m2.246s
00:08:51.215 user 0m7.229s
00:08:51.215 sys 0m0.605s
00:08:51.215 19:44:42 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:51.215 19:44:42 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x
00:08:51.215 ************************************
00:08:51.215 END TEST accel_cdev_decomp_full_mcore
00:08:51.215 ************************************
00:08:51.215 19:44:42 accel -- accel/accel.sh@134 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:51.215 19:44:42 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']'
00:08:51.215 19:44:42 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:51.215 19:44:42 accel -- common/autotest_common.sh@10 -- # set +x
00:08:51.475 ************************************
00:08:51.475 START TEST accel_cdev_decomp_mthread
00:08:51.475 ************************************
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=,
00:08:51.475 19:44:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r .
00:08:51.475 [2024-07-24 19:44:42.870113] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:08:51.475 [2024-07-24 19:44:42.870179] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350633 ]
00:08:51.475 [2024-07-24 19:44:42.999544] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:51.734 [2024-07-24 19:44:43.102032] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:52.302 [2024-07-24 19:44:43.871706] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:08:52.302 [2024-07-24 19:44:43.874325] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12b30e0 PMD being used: compress_qat
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1
00:08:52.303 [2024-07-24 19:44:43.879358] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12b82e0 PMD being used: compress_qat
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:08:52.303 [2024-07-24 19:44:43.881846] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13dadc0 PMD being used: compress_qat
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds'
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes
00:08:52.303 19:44:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:53.679 19:44:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:53.680 19:44:45
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:53.680 19:44:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:53.680 19:44:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:08:53.680 19:44:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:53.680 19:44:45 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:08:53.680
00:08:53.680 real 0m2.225s
00:08:53.680 user 0m1.608s
00:08:53.680 sys 0m0.603s
00:08:53.680 19:44:45 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:53.680 19:44:45 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x
00:08:53.680 ************************************
00:08:53.680 END TEST accel_cdev_decomp_mthread
00:08:53.680 ************************************
00:08:53.680 19:44:45 accel -- accel/accel.sh@135 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:53.680 19:44:45 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:08:53.680 19:44:45 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:53.680 19:44:45 accel -- common/autotest_common.sh@10 -- # set +x
00:08:53.680 ************************************
00:08:53.680 START TEST accel_cdev_decomp_full_mthread
00:08:53.680 ************************************
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=,
00:08:53.680 19:44:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r .
00:08:53.680 [2024-07-24 19:44:45.177689] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:08:53.680 [2024-07-24 19:44:45.177749] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1350995 ]
00:08:53.939 [2024-07-24 19:44:45.307594] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:53.939 [2024-07-24 19:44:45.408350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:54.875 [2024-07-24 19:44:46.163458] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:08:54.875 [2024-07-24 19:44:46.166053] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f840e0 PMD being used: compress_qat
00:08:54.875 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:54.875 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:54.875 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:54.875 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:54.875 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1
00:08:54.876 [2024-07-24 19:44:46.170178] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f873b0 PMD being used: compress_qat
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes'
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:08:54.876 [2024-07-24 19:44:46.172984] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2188db0 PMD being used: compress_qat
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds'
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes
00:08:54.876 19:44:46 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:08:55.815
00:08:55.815 real 0m2.199s
00:08:55.815 user 0m1.609s
00:08:55.815 sys 0m0.584s
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:55.815 19:44:47 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x
00:08:55.815 ************************************
00:08:55.815 END TEST accel_cdev_decomp_full_mthread
00:08:55.815 ************************************
00:08:55.815 19:44:47 accel -- accel/accel.sh@136 -- # unset COMPRESSDEV
00:08:55.815 19:44:47 accel -- accel/accel.sh@139 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62
00:08:55.815 19:44:47 accel -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']'
00:08:55.815 19:44:47 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:55.815 19:44:47 accel -- common/autotest_common.sh@10 -- # set +x
00:08:55.815 19:44:47 accel -- accel/accel.sh@139 -- # build_accel_config
00:08:55.815 19:44:47 accel -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:55.815 19:44:47 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:55.815 19:44:47 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:55.815 19:44:47 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:55.815 19:44:47 accel -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:55.815 19:44:47 accel -- accel/accel.sh@40 -- # local IFS=,
00:08:55.815 19:44:47 accel -- accel/accel.sh@41 -- # jq -r .
00:08:56.073 ************************************
00:08:56.073 START TEST accel_dif_functional_tests
00:08:56.073 ************************************
00:08:56.073 19:44:47 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62
00:08:56.073 [2024-07-24 19:44:47.481078] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:08:56.073 [2024-07-24 19:44:47.481138] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351365 ]
00:08:56.073 [2024-07-24 19:44:47.608957] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:08:56.333 [2024-07-24 19:44:47.712554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:56.333 [2024-07-24 19:44:47.712654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:08:56.333 [2024-07-24 19:44:47.712657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:56.333
00:08:56.333
00:08:56.333 CUnit - A unit testing framework for C - Version 2.1-3
00:08:56.333 http://cunit.sourceforge.net/
00:08:56.333
00:08:56.333
00:08:56.333 Suite: accel_dif
00:08:56.333 Test: verify: DIF generated, GUARD check ...passed
00:08:56.333 Test: verify: DIX generated, GUARD check ...passed
00:08:56.333 Test: verify: DIF generated, APPTAG check ...passed
00:08:56.333 Test: verify: DIX generated, APPTAG check ...passed
00:08:56.333 Test: verify: DIF generated, REFTAG check ...passed
00:08:56.333 Test: verify: DIX generated, REFTAG check ...passed
00:08:56.333 Test: verify: DIF not generated, GUARD check ...[2024-07-24 19:44:47.815106] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867
00:08:56.333 passed
00:08:56.333 Test: verify: DIX not generated, GUARD check ...[2024-07-24 19:44:47.815184] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=0, Actual=7867
00:08:56.333 passed
00:08:56.333 Test: verify: DIF not generated, APPTAG check ...[2024-07-24 19:44:47.815221] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a
00:08:56.333 passed
00:08:56.333 Test: verify: DIX not generated, APPTAG check ...[2024-07-24 19:44:47.815257] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=0
00:08:56.333 passed
00:08:56.333 Test: verify: DIF not generated, REFTAG check ...[2024-07-24 19:44:47.815293] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a
00:08:56.333 passed
00:08:56.333 Test: verify: DIX not generated, REFTAG check ...[2024-07-24 19:44:47.815332] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=0
00:08:56.333 passed
00:08:56.333 Test: verify: DIF APPTAG correct, APPTAG check ...passed
00:08:56.333 Test: verify: DIX APPTAG correct, APPTAG check ...passed
00:08:56.333 Test: verify: DIF APPTAG incorrect, APPTAG check ...[2024-07-24 19:44:47.815455] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14
00:08:56.333 passed
00:08:56.333 Test: verify: DIX APPTAG incorrect, APPTAG check ...[2024-07-24 19:44:47.815501] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14
00:08:56.333 passed
00:08:56.333 Test: verify: DIF APPTAG incorrect, no APPTAG check ...passed
00:08:56.333 Test: verify: DIX APPTAG incorrect, no APPTAG check ...passed
00:08:56.333 Test: verify: DIF REFTAG incorrect, REFTAG ignore ...passed
00:08:56.333 Test: verify: DIX REFTAG incorrect, REFTAG ignore ...passed
00:08:56.333 Test: verify: DIF REFTAG_INIT correct, REFTAG check ...passed
00:08:56.333 Test: verify: DIX REFTAG_INIT correct, REFTAG check ...passed
00:08:56.333 Test: verify: DIF REFTAG_INIT incorrect, REFTAG check ...[2024-07-24 19:44:47.815806] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10
00:08:56.333 passed
00:08:56.333 Test: verify: DIX REFTAG_INIT incorrect, REFTAG check ...[2024-07-24 19:44:47.815861] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10
00:08:56.333 passed
00:08:56.333 Test: verify copy: DIF
generated, GUARD check ...passed 00:08:56.333 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:56.333 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:56.333 Test: verify copy: DIF not generated, GUARD check ...[2024-07-24 19:44:47.816036] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:56.333 passed 00:08:56.333 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-24 19:44:47.816075] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:56.333 passed 00:08:56.333 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-24 19:44:47.816113] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:56.333 passed 00:08:56.333 Test: generate copy: DIF generated, GUARD check ...passed 00:08:56.333 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:56.333 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:56.333 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:56.333 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:56.333 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:56.333 Test: generate copy: DIF iovecs-len validate ...[2024-07-24 19:44:47.816401] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:56.333 passed 00:08:56.333 Test: generate copy: DIF buffer alignment validate ...passed 00:08:56.333 00:08:56.333 Run Summary: Type Total Ran Passed Failed Inactive 00:08:56.333 suites 1 1 n/a 0 0 00:08:56.333 tests 38 38 38 0 0 00:08:56.333 asserts 170 170 170 0 n/a 00:08:56.333 00:08:56.333 Elapsed time = 0.005 seconds 00:08:56.593 00:08:56.593 real 0m0.613s 00:08:56.593 user 0m0.833s 00:08:56.593 sys 0m0.233s 00:08:56.593 19:44:48 accel.accel_dif_functional_tests -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:56.593 19:44:48 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:56.593 ************************************ 00:08:56.593 END TEST accel_dif_functional_tests 00:08:56.593 ************************************ 00:08:56.593 00:08:56.593 real 0m57.164s 00:08:56.593 user 1m4.627s 00:08:56.593 sys 0m12.807s 00:08:56.593 19:44:48 accel -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:56.593 19:44:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:56.593 ************************************ 00:08:56.593 END TEST accel 00:08:56.593 ************************************ 00:08:56.593 19:44:48 -- spdk/autotest.sh@186 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:56.593 19:44:48 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:56.593 19:44:48 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:56.593 19:44:48 -- common/autotest_common.sh@10 -- # set +x 00:08:56.593 ************************************ 00:08:56.593 START TEST accel_rpc 00:08:56.593 ************************************ 00:08:56.593 19:44:48 accel_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:56.853 * Looking for test storage... 
00:08:56.853 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:56.853 19:44:48 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:56.853 19:44:48 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1351429 00:08:56.853 19:44:48 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1351429 00:08:56.853 19:44:48 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:56.853 19:44:48 accel_rpc -- common/autotest_common.sh@831 -- # '[' -z 1351429 ']' 00:08:56.853 19:44:48 accel_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:56.853 19:44:48 accel_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:56.853 19:44:48 accel_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:56.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:56.853 19:44:48 accel_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:56.853 19:44:48 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:56.853 [2024-07-24 19:44:48.344067] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:56.853 [2024-07-24 19:44:48.344150] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351429 ] 00:08:57.112 [2024-07-24 19:44:48.475234] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:57.112 [2024-07-24 19:44:48.581420] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.679 19:44:49 accel_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:57.679 19:44:49 accel_rpc -- common/autotest_common.sh@864 -- # return 0 00:08:57.679 19:44:49 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:57.679 19:44:49 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:57.679 19:44:49 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:57.679 19:44:49 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:57.679 19:44:49 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:57.680 19:44:49 accel_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:57.680 19:44:49 accel_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:57.680 19:44:49 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:57.938 ************************************ 00:08:57.938 START TEST accel_assign_opcode 00:08:57.938 ************************************ 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # accel_assign_opcode_test_suite 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:57.938 [2024-07-24 19:44:49.311769] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:57.938 [2024-07-24 19:44:49.319780] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:57.938 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:58.197 software 00:08:58.197 00:08:58.197 real 0m0.294s 00:08:58.197 user 0m0.053s 00:08:58.197 sys 0m0.012s 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1126 -- # 
xtrace_disable 00:08:58.197 19:44:49 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.197 ************************************ 00:08:58.197 END TEST accel_assign_opcode 00:08:58.197 ************************************ 00:08:58.197 19:44:49 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1351429 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@950 -- # '[' -z 1351429 ']' 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@954 -- # kill -0 1351429 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@955 -- # uname 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1351429 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1351429' 00:08:58.197 killing process with pid 1351429 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@969 -- # kill 1351429 00:08:58.197 19:44:49 accel_rpc -- common/autotest_common.sh@974 -- # wait 1351429 00:08:58.764 00:08:58.764 real 0m1.906s 00:08:58.764 user 0m1.997s 00:08:58.764 sys 0m0.582s 00:08:58.764 19:44:50 accel_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:58.764 19:44:50 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:58.764 ************************************ 00:08:58.764 END TEST accel_rpc 00:08:58.764 ************************************ 00:08:58.764 19:44:50 -- spdk/autotest.sh@189 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:58.764 19:44:50 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:58.764 19:44:50 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:58.764 19:44:50 -- 
common/autotest_common.sh@10 -- # set +x 00:08:58.764 ************************************ 00:08:58.764 START TEST app_cmdline 00:08:58.764 ************************************ 00:08:58.764 19:44:50 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:58.764 * Looking for test storage... 00:08:58.764 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:58.764 19:44:50 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:58.764 19:44:50 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1351846 00:08:58.764 19:44:50 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:58.764 19:44:50 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1351846 00:08:58.764 19:44:50 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 1351846 ']' 00:08:58.764 19:44:50 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:58.764 19:44:50 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:58.764 19:44:50 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:58.764 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:58.764 19:44:50 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:58.764 19:44:50 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:58.764 [2024-07-24 19:44:50.340492] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:08:58.764 [2024-07-24 19:44:50.340581] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351846 ] 00:08:59.022 [2024-07-24 19:44:50.483551] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.022 [2024-07-24 19:44:50.587986] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:59.958 19:44:51 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:59.958 19:44:51 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:59.958 { 00:08:59.958 "version": "SPDK v24.09-pre git sha1 3bc1795d3", 00:08:59.958 "fields": { 00:08:59.958 "major": 24, 00:08:59.958 "minor": 9, 00:08:59.958 "patch": 0, 00:08:59.958 "suffix": "-pre", 00:08:59.958 "commit": "3bc1795d3" 00:08:59.958 } 00:08:59.958 } 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:59.958 19:44:51 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:59.958 19:44:51 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:59.958 19:44:51 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:59.958 19:44:51 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:59.958 19:44:51 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:59.958 19:44:51 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:08:59.958 19:44:51 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:00.218 request: 00:09:00.218 { 00:09:00.218 "method": "env_dpdk_get_mem_stats", 00:09:00.218 "req_id": 1 00:09:00.218 } 00:09:00.218 Got JSON-RPC error response 00:09:00.218 response: 00:09:00.218 { 00:09:00.218 
"code": -32601, 00:09:00.218 "message": "Method not found" 00:09:00.218 } 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:00.218 19:44:51 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1351846 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 1351846 ']' 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 1351846 00:09:00.218 19:44:51 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:09:00.477 19:44:51 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:00.477 19:44:51 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1351846 00:09:00.477 19:44:51 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:00.477 19:44:51 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:00.477 19:44:51 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1351846' 00:09:00.477 killing process with pid 1351846 00:09:00.477 19:44:51 app_cmdline -- common/autotest_common.sh@969 -- # kill 1351846 00:09:00.477 19:44:51 app_cmdline -- common/autotest_common.sh@974 -- # wait 1351846 00:09:00.736 00:09:00.736 real 0m2.092s 00:09:00.736 user 0m2.478s 00:09:00.736 sys 0m0.661s 00:09:00.736 19:44:52 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:00.736 19:44:52 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:00.736 ************************************ 00:09:00.736 END TEST app_cmdline 00:09:00.736 ************************************ 00:09:00.737 19:44:52 -- spdk/autotest.sh@190 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 
00:09:00.737 19:44:52 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:00.737 19:44:52 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:00.737 19:44:52 -- common/autotest_common.sh@10 -- # set +x 00:09:00.996 ************************************ 00:09:00.996 START TEST version 00:09:00.996 ************************************ 00:09:00.996 19:44:52 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:00.996 * Looking for test storage... 00:09:00.996 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:00.996 19:44:52 version -- app/version.sh@17 -- # get_header_version major 00:09:00.996 19:44:52 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # cut -f2 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # tr -d '"' 00:09:00.996 19:44:52 version -- app/version.sh@17 -- # major=24 00:09:00.996 19:44:52 version -- app/version.sh@18 -- # get_header_version minor 00:09:00.996 19:44:52 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # cut -f2 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # tr -d '"' 00:09:00.996 19:44:52 version -- app/version.sh@18 -- # minor=9 00:09:00.996 19:44:52 version -- app/version.sh@19 -- # get_header_version patch 00:09:00.996 19:44:52 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # cut -f2 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # tr -d '"' 00:09:00.996 19:44:52 version -- app/version.sh@19 -- # patch=0 00:09:00.996 
19:44:52 version -- app/version.sh@20 -- # get_header_version suffix 00:09:00.996 19:44:52 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # cut -f2 00:09:00.996 19:44:52 version -- app/version.sh@14 -- # tr -d '"' 00:09:00.996 19:44:52 version -- app/version.sh@20 -- # suffix=-pre 00:09:00.996 19:44:52 version -- app/version.sh@22 -- # version=24.9 00:09:00.996 19:44:52 version -- app/version.sh@25 -- # (( patch != 0 )) 00:09:00.996 19:44:52 version -- app/version.sh@28 -- # version=24.9rc0 00:09:00.996 19:44:52 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:09:00.996 19:44:52 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:00.996 19:44:52 version -- app/version.sh@30 -- # py_version=24.9rc0 00:09:00.996 19:44:52 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:09:00.996 00:09:00.996 real 0m0.200s 00:09:00.996 user 0m0.106s 00:09:00.996 sys 0m0.144s 00:09:00.996 19:44:52 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:00.996 19:44:52 version -- common/autotest_common.sh@10 -- # set +x 00:09:00.996 ************************************ 00:09:00.996 END TEST version 00:09:00.996 ************************************ 00:09:00.996 19:44:52 -- spdk/autotest.sh@192 -- # '[' 1 -eq 1 ']' 00:09:00.996 19:44:52 -- spdk/autotest.sh@193 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:00.996 19:44:52 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:00.996 19:44:52 -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:09:00.996 19:44:52 -- common/autotest_common.sh@10 -- # set +x 00:09:01.257 ************************************ 00:09:01.257 START TEST blockdev_general 00:09:01.257 ************************************ 00:09:01.257 19:44:52 blockdev_general -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:01.257 * Looking for test storage... 00:09:01.257 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:01.257 19:44:52 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:09:01.257 
19:44:52 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1352320 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1352320 00:09:01.257 19:44:52 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:09:01.257 19:44:52 blockdev_general -- common/autotest_common.sh@831 -- # '[' -z 1352320 ']' 00:09:01.257 19:44:52 blockdev_general -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:01.257 19:44:52 blockdev_general -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:01.257 19:44:52 blockdev_general -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:01.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:09:01.257 19:44:52 blockdev_general -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:01.257 19:44:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.257 [2024-07-24 19:44:52.816924] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:09:01.257 [2024-07-24 19:44:52.816998] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1352320 ] 00:09:01.516 [2024-07-24 19:44:52.947421] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.516 [2024-07-24 19:44:53.053640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:02.455 19:44:53 blockdev_general -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:02.455 19:44:53 blockdev_general -- common/autotest_common.sh@864 -- # return 0 00:09:02.455 19:44:53 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:09:02.455 19:44:53 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:09:02.455 19:44:53 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:02.455 19:44:53 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.455 19:44:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.455 [2024-07-24 19:44:53.997846] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.455 [2024-07-24 19:44:53.997901] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.455 00:09:02.455 [2024-07-24 19:44:54.005836] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.455 [2024-07-24 19:44:54.005861] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.455 00:09:02.455 Malloc0 00:09:02.455 Malloc1 00:09:02.715 
Malloc2 00:09:02.715 Malloc3 00:09:02.715 Malloc4 00:09:02.715 Malloc5 00:09:02.715 Malloc6 00:09:02.715 Malloc7 00:09:02.715 Malloc8 00:09:02.715 Malloc9 00:09:02.715 [2024-07-24 19:44:54.142678] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:02.715 [2024-07-24 19:44:54.142728] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:02.715 [2024-07-24 19:44:54.142750] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e7e750 00:09:02.715 [2024-07-24 19:44:54.142763] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:02.715 [2024-07-24 19:44:54.144099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:02.715 [2024-07-24 19:44:54.144127] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:02.715 TestPT 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.715 19:44:54 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:02.715 5000+0 records in 00:09:02.715 5000+0 records out 00:09:02.715 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0271676 s, 377 MB/s 00:09:02.715 19:44:54 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.715 AIO0 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.715 19:44:54 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.715 19:44:54 blockdev_general -- 
common/autotest_common.sh@10 -- # set +x 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.715 19:44:54 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:09:02.715 19:44:54 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.715 19:44:54 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.715 19:44:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.974 19:44:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.974 19:44:54 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:02.974 19:44:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.974 19:44:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.974 19:44:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.974 19:44:54 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:09:02.974 19:44:54 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:09:02.974 19:44:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.974 19:44:54 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:09:02.974 19:44:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.974 19:44:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.974 19:44:54 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:09:03.234 19:44:54 blockdev_general -- 
bdev/blockdev.sh@748 -- # jq -r .name 00:09:03.235 19:44:54 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "59773273-a1e3-49ce-853c-7e5a5a262862"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "59773273-a1e3-49ce-853c-7e5a5a262862",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "f7c0a053-2d2b-5329-8b1c-a04dfe1b00e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f7c0a053-2d2b-5329-8b1c-a04dfe1b00e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "1cc3d25a-82b3-5ba5-806e-4893caef2daa"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "1cc3d25a-82b3-5ba5-806e-4893caef2daa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "bb4ff44c-edbf-540a-bdf6-dce9d68fdcc3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb4ff44c-edbf-540a-bdf6-dce9d68fdcc3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' 
"nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "6acfe163-5f4a-5fd0-be57-3556a45221fc"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6acfe163-5f4a-5fd0-be57-3556a45221fc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "9e07eae6-4e10-5293-80b4-8f0647addbdf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9e07eae6-4e10-5293-80b4-8f0647addbdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' 
"split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "926b0284-0a2c-50a8-bf70-6a93a860eed1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "926b0284-0a2c-50a8-bf70-6a93a860eed1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "0396453e-446c-57cb-b009-a1d4702ddb03"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0396453e-446c-57cb-b009-a1d4702ddb03",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' 
"offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "9406c365-10a4-5439-9a7a-d9bb26a413b4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9406c365-10a4-5439-9a7a-d9bb26a413b4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "3a48ef9a-b83f-5e5a-864b-1505facb2bd9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3a48ef9a-b83f-5e5a-864b-1505facb2bd9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' 
"name": "Malloc2p7",' ' "aliases": [' ' "12afcef7-72f9-502e-b539-157a51701373"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "12afcef7-72f9-502e-b539-157a51701373",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "47aecfc9-7fa2-5692-9819-db838377c056"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "47aecfc9-7fa2-5692-9819-db838377c056",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' 
}' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "eaac3e18-96ea-424a-a31c-5005985819d3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "eaac3e18-96ea-424a-a31c-5005985819d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "eaac3e18-96ea-424a-a31c-5005985819d3",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "a0fcc8a6-312c-4698-9353-6a8f1ed4c2d2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "a7f7745d-9cea-484c-8eac-7ce7191e7ded",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' 
"f5f87728-a607-4252-9fae-90a82788fe4e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f5f87728-a607-4252-9fae-90a82788fe4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f5f87728-a607-4252-9fae-90a82788fe4e",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "c89cce57-1c27-4624-99ce-dcdc0818a832",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6bcbd865-1714-4c98-9d71-ef5d9f8e19d1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "ce63bbd1-df84-4168-9d84-776f26fe1aa6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ce63bbd1-df84-4168-9d84-776f26fe1aa6",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "ce63bbd1-df84-4168-9d84-776f26fe1aa6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "0ba9049c-e555-48a5-9d0c-b7a3f63c214d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "c7ed200b-322e-42f5-83ba-a3b964556d32",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "7292d96a-990f-4c4b-aad3-e15a946986a6"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "7292d96a-990f-4c4b-aad3-e15a946986a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:03.235 19:44:54 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:09:03.235 19:44:54 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:09:03.235 19:44:54 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:09:03.235 19:44:54 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 1352320 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@950 -- # '[' -z 1352320 ']' 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@954 -- # kill -0 1352320 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@955 -- # uname 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1352320 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1352320' 00:09:03.235 killing process with pid 1352320 00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@969 -- # kill 1352320 
00:09:03.235 19:44:54 blockdev_general -- common/autotest_common.sh@974 -- # wait 1352320 00:09:03.803 19:44:55 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:03.803 19:44:55 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:03.803 19:44:55 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:03.803 19:44:55 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:03.803 19:44:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:03.803 ************************************ 00:09:03.803 START TEST bdev_hello_world 00:09:03.803 ************************************ 00:09:03.803 19:44:55 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:03.803 [2024-07-24 19:44:55.276340] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:09:03.803 [2024-07-24 19:44:55.276411] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1352577 ] 00:09:04.063 [2024-07-24 19:44:55.407146] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.063 [2024-07-24 19:44:55.509493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.322 [2024-07-24 19:44:55.661946] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:04.322 [2024-07-24 19:44:55.662008] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:04.322 [2024-07-24 19:44:55.662023] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:04.322 [2024-07-24 19:44:55.669949] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.322 [2024-07-24 19:44:55.669976] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.322 [2024-07-24 19:44:55.677960] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:04.322 [2024-07-24 19:44:55.677984] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:04.322 [2024-07-24 19:44:55.755119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:04.322 [2024-07-24 19:44:55.755176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:04.322 [2024-07-24 19:44:55.755197] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x216bc30 00:09:04.322 [2024-07-24 19:44:55.755209] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:04.322 [2024-07-24 19:44:55.756877] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:09:04.322 [2024-07-24 19:44:55.756908] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:04.322 [2024-07-24 19:44:55.907915] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:09:04.322 [2024-07-24 19:44:55.907986] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:09:04.322 [2024-07-24 19:44:55.908042] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:09:04.322 [2024-07-24 19:44:55.908118] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:09:04.322 [2024-07-24 19:44:55.908197] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:09:04.322 [2024-07-24 19:44:55.908228] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:09:04.322 [2024-07-24 19:44:55.908291] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:09:04.322 00:09:04.322 [2024-07-24 19:44:55.908331] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:09:04.890 00:09:04.890 real 0m1.038s 00:09:04.890 user 0m0.686s 00:09:04.890 sys 0m0.313s 00:09:04.890 19:44:56 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:04.890 19:44:56 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:09:04.890 ************************************ 00:09:04.890 END TEST bdev_hello_world 00:09:04.890 ************************************ 00:09:04.890 19:44:56 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:09:04.890 19:44:56 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:04.890 19:44:56 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:04.890 19:44:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:04.890 ************************************ 00:09:04.890 START TEST bdev_bounds 00:09:04.890 ************************************ 00:09:04.890 19:44:56 
blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1352724 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1352724' 00:09:04.890 Process bdevio pid: 1352724 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1352724 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1352724 ']' 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:04.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:04.890 19:44:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:04.890 [2024-07-24 19:44:56.403916] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:09:04.890 [2024-07-24 19:44:56.403986] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1352724 ] 00:09:05.149 [2024-07-24 19:44:56.537695] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:05.149 [2024-07-24 19:44:56.648003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:05.149 [2024-07-24 19:44:56.648115] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:05.149 [2024-07-24 19:44:56.648116] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:05.408 [2024-07-24 19:44:56.797975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.408 [2024-07-24 19:44:56.798027] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:05.408 [2024-07-24 19:44:56.798041] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:05.408 [2024-07-24 19:44:56.805981] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.408 [2024-07-24 19:44:56.806008] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.409 [2024-07-24 19:44:56.813999] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.409 [2024-07-24 19:44:56.814024] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.409 [2024-07-24 19:44:56.887415] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.409 [2024-07-24 19:44:56.887471] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:05.409 [2024-07-24 19:44:56.887489] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11f0fa0 
00:09:05.409 [2024-07-24 19:44:56.887501] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:05.409 [2024-07-24 19:44:56.888952] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:05.409 [2024-07-24 19:44:56.888981] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:05.977 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:05.977 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:09:05.977 19:44:57 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:09:05.977 I/O targets: 00:09:05.977 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:09:05.977 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:09:05.977 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:09:05.977 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:09:05.977 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:09:05.977 raid0: 131072 blocks of 512 bytes (64 MiB) 00:09:05.977 concat0: 131072 blocks of 512 bytes (64 MiB) 00:09:05.977 raid1: 65536 blocks of 512 bytes (32 MiB) 00:09:05.977 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:09:05.977 00:09:05.977 00:09:05.977 CUnit - A unit testing framework for C - Version 2.1-3 00:09:05.977 http://cunit.sourceforge.net/ 00:09:05.977 00:09:05.977 00:09:05.977 Suite: bdevio tests on: AIO0 00:09:05.977 Test: blockdev write read block ...passed 00:09:05.977 Test: blockdev write zeroes read block ...passed 00:09:05.977 
Test: blockdev write zeroes read no split ...passed 00:09:05.977 Test: blockdev write zeroes read split ...passed 00:09:05.977 Test: blockdev write zeroes read split partial ...passed 00:09:05.977 Test: blockdev reset ...passed 00:09:05.977 Test: blockdev write read 8 blocks ...passed 00:09:05.977 Test: blockdev write read size > 128k ...passed 00:09:05.977 Test: blockdev write read invalid size ...passed 00:09:05.977 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.977 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.977 Test: blockdev write read max offset ...passed 00:09:05.977 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.977 Test: blockdev writev readv 8 blocks ...passed 00:09:05.977 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.977 Test: blockdev writev readv block ...passed 00:09:05.977 Test: blockdev writev readv size > 128k ...passed 00:09:05.977 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.977 Test: blockdev comparev and writev ...passed 00:09:05.977 Test: blockdev nvme passthru rw ...passed 00:09:05.977 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.977 Test: blockdev nvme admin passthru ...passed 00:09:05.977 Test: blockdev copy ...passed 00:09:05.977 Suite: bdevio tests on: raid1 00:09:05.977 Test: blockdev write read block ...passed 00:09:05.977 Test: blockdev write zeroes read block ...passed 00:09:05.977 Test: blockdev write zeroes read no split ...passed 00:09:05.977 Test: blockdev write zeroes read split ...passed 00:09:05.977 Test: blockdev write zeroes read split partial ...passed 00:09:05.977 Test: blockdev reset ...passed 00:09:05.977 Test: blockdev write read 8 blocks ...passed 00:09:05.977 Test: blockdev write read size > 128k ...passed 00:09:05.977 Test: blockdev write read invalid size ...passed 00:09:05.977 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:09:05.977 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.977 Test: blockdev write read max offset ...passed 00:09:05.977 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.977 Test: blockdev writev readv 8 blocks ...passed 00:09:05.977 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.977 Test: blockdev writev readv block ...passed 00:09:05.977 Test: blockdev writev readv size > 128k ...passed 00:09:05.977 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.977 Test: blockdev comparev and writev ...passed 00:09:05.977 Test: blockdev nvme passthru rw ...passed 00:09:05.977 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.977 Test: blockdev nvme admin passthru ...passed 00:09:05.977 Test: blockdev copy ...passed 00:09:05.977 Suite: bdevio tests on: concat0 00:09:05.977 Test: blockdev write read block ...passed 00:09:05.977 Test: blockdev write zeroes read block ...passed 00:09:05.977 Test: blockdev write zeroes read no split ...passed 00:09:05.977 Test: blockdev write zeroes read split ...passed 00:09:05.977 Test: blockdev write zeroes read split partial ...passed 00:09:05.977 Test: blockdev reset ...passed 00:09:05.977 Test: blockdev write read 8 blocks ...passed 00:09:05.977 Test: blockdev write read size > 128k ...passed 00:09:05.977 Test: blockdev write read invalid size ...passed 00:09:05.977 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.977 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.977 Test: blockdev write read max offset ...passed 00:09:05.977 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.977 Test: blockdev writev readv 8 blocks ...passed 00:09:05.977 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.977 Test: blockdev writev readv block ...passed 00:09:05.977 Test: blockdev writev readv size > 128k ...passed 00:09:05.977 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:09:05.977 Test: blockdev comparev and writev ...passed 00:09:05.977 Test: blockdev nvme passthru rw ...passed 00:09:05.977 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.977 Test: blockdev nvme admin passthru ...passed 00:09:05.977 Test: blockdev copy ...passed 00:09:05.977 Suite: bdevio tests on: raid0 00:09:05.977 Test: blockdev write read block ...passed 00:09:05.977 Test: blockdev write zeroes read block ...passed 00:09:05.977 Test: blockdev write zeroes read no split ...passed 00:09:05.977 Test: blockdev write zeroes read split ...passed 00:09:05.977 Test: blockdev write zeroes read split partial ...passed 00:09:05.977 Test: blockdev reset ...passed 00:09:05.977 Test: blockdev write read 8 blocks ...passed 00:09:05.977 Test: blockdev write read size > 128k ...passed 00:09:05.977 Test: blockdev write read invalid size ...passed 00:09:05.977 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.977 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.977 Test: blockdev write read max offset ...passed 00:09:05.977 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.977 Test: blockdev writev readv 8 blocks ...passed 00:09:05.977 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.977 Test: blockdev writev readv block ...passed 00:09:05.977 Test: blockdev writev readv size > 128k ...passed 00:09:05.977 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.978 Test: blockdev comparev and writev ...passed 00:09:05.978 Test: blockdev nvme passthru rw ...passed 00:09:05.978 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.978 Test: blockdev nvme admin passthru ...passed 00:09:05.978 Test: blockdev copy ...passed 00:09:05.978 Suite: bdevio tests on: TestPT 00:09:05.978 Test: blockdev write read block ...passed 00:09:05.978 Test: blockdev write zeroes read block ...passed 
00:09:05.978 Test: blockdev write zeroes read no split ...passed 00:09:05.978 Test: blockdev write zeroes read split ...passed 00:09:06.237 Test: blockdev write zeroes read split partial ...passed 00:09:06.237 Test: blockdev reset ...passed 00:09:06.237 Test: blockdev write read 8 blocks ...passed 00:09:06.237 Test: blockdev write read size > 128k ...passed 00:09:06.237 Test: blockdev write read invalid size ...passed 00:09:06.237 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.237 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.237 Test: blockdev write read max offset ...passed 00:09:06.237 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.237 Test: blockdev writev readv 8 blocks ...passed 00:09:06.237 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.237 Test: blockdev writev readv block ...passed 00:09:06.237 Test: blockdev writev readv size > 128k ...passed 00:09:06.237 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.237 Test: blockdev comparev and writev ...passed 00:09:06.237 Test: blockdev nvme passthru rw ...passed 00:09:06.237 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.237 Test: blockdev nvme admin passthru ...passed 00:09:06.237 Test: blockdev copy ...passed 00:09:06.237 Suite: bdevio tests on: Malloc2p7 00:09:06.237 Test: blockdev write read block ...passed 00:09:06.237 Test: blockdev write zeroes read block ...passed 00:09:06.237 Test: blockdev write zeroes read no split ...passed 00:09:06.237 Test: blockdev write zeroes read split ...passed 00:09:06.237 Test: blockdev write zeroes read split partial ...passed 00:09:06.237 Test: blockdev reset ...passed 00:09:06.237 Test: blockdev write read 8 blocks ...passed 00:09:06.237 Test: blockdev write read size > 128k ...passed 00:09:06.237 Test: blockdev write read invalid size ...passed 00:09:06.237 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:09:06.237 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.238 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.238 Test: blockdev writev readv block ...passed 00:09:06.238 Test: blockdev writev readv size > 128k ...passed 00:09:06.238 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.238 Test: blockdev comparev and writev ...passed 00:09:06.238 Test: blockdev nvme passthru rw ...passed 00:09:06.238 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.238 Test: blockdev nvme admin passthru ...passed 00:09:06.238 Test: blockdev copy ...passed 00:09:06.238 Suite: bdevio tests on: Malloc2p6 00:09:06.238 Test: blockdev write read block ...passed 00:09:06.238 Test: blockdev write zeroes read block ...passed 00:09:06.238 Test: blockdev write zeroes read no split ...passed 00:09:06.238 Test: blockdev write zeroes read split ...passed 00:09:06.238 Test: blockdev write zeroes read split partial ...passed 00:09:06.238 Test: blockdev reset ...passed 00:09:06.238 Test: blockdev write read 8 blocks ...passed 00:09:06.238 Test: blockdev write read size > 128k ...passed 00:09:06.238 Test: blockdev write read invalid size ...passed 00:09:06.238 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.238 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.238 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.238 Test: blockdev writev readv block ...passed 00:09:06.238 Test: blockdev writev readv size > 128k ...passed 00:09:06.238 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.238 Test: blockdev comparev and writev ...passed 00:09:06.238 Test: blockdev nvme passthru rw ...passed 00:09:06.238 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.238 Test: blockdev nvme admin passthru ...passed 00:09:06.238 Test: blockdev copy ...passed 00:09:06.238 Suite: bdevio tests on: Malloc2p5 00:09:06.238 Test: blockdev write read block ...passed 00:09:06.238 Test: blockdev write zeroes read block ...passed 00:09:06.238 Test: blockdev write zeroes read no split ...passed 00:09:06.238 Test: blockdev write zeroes read split ...passed 00:09:06.238 Test: blockdev write zeroes read split partial ...passed 00:09:06.238 Test: blockdev reset ...passed 00:09:06.238 Test: blockdev write read 8 blocks ...passed 00:09:06.238 Test: blockdev write read size > 128k ...passed 00:09:06.238 Test: blockdev write read invalid size ...passed 00:09:06.238 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.238 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.238 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.238 Test: blockdev writev readv block ...passed 00:09:06.238 Test: blockdev writev readv size > 128k ...passed 00:09:06.238 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.238 Test: blockdev comparev and writev ...passed 00:09:06.238 Test: blockdev nvme passthru rw ...passed 00:09:06.238 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.238 Test: blockdev nvme admin passthru ...passed 00:09:06.238 Test: blockdev copy ...passed 00:09:06.238 Suite: bdevio tests on: Malloc2p4 00:09:06.238 Test: blockdev write read block ...passed 00:09:06.238 Test: blockdev write zeroes read block 
...passed 00:09:06.238 Test: blockdev write zeroes read no split ...passed 00:09:06.238 Test: blockdev write zeroes read split ...passed 00:09:06.238 Test: blockdev write zeroes read split partial ...passed 00:09:06.238 Test: blockdev reset ...passed 00:09:06.238 Test: blockdev write read 8 blocks ...passed 00:09:06.238 Test: blockdev write read size > 128k ...passed 00:09:06.238 Test: blockdev write read invalid size ...passed 00:09:06.238 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.238 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.238 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.238 Test: blockdev writev readv block ...passed 00:09:06.238 Test: blockdev writev readv size > 128k ...passed 00:09:06.238 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.238 Test: blockdev comparev and writev ...passed 00:09:06.238 Test: blockdev nvme passthru rw ...passed 00:09:06.238 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.238 Test: blockdev nvme admin passthru ...passed 00:09:06.238 Test: blockdev copy ...passed 00:09:06.238 Suite: bdevio tests on: Malloc2p3 00:09:06.238 Test: blockdev write read block ...passed 00:09:06.238 Test: blockdev write zeroes read block ...passed 00:09:06.238 Test: blockdev write zeroes read no split ...passed 00:09:06.238 Test: blockdev write zeroes read split ...passed 00:09:06.238 Test: blockdev write zeroes read split partial ...passed 00:09:06.238 Test: blockdev reset ...passed 00:09:06.238 Test: blockdev write read 8 blocks ...passed 00:09:06.238 Test: blockdev write read size > 128k ...passed 00:09:06.238 Test: blockdev write read invalid size ...passed 00:09:06.238 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:09:06.238 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.238 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.238 Test: blockdev writev readv block ...passed 00:09:06.238 Test: blockdev writev readv size > 128k ...passed 00:09:06.238 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.238 Test: blockdev comparev and writev ...passed 00:09:06.238 Test: blockdev nvme passthru rw ...passed 00:09:06.238 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.238 Test: blockdev nvme admin passthru ...passed 00:09:06.238 Test: blockdev copy ...passed 00:09:06.238 Suite: bdevio tests on: Malloc2p2 00:09:06.238 Test: blockdev write read block ...passed 00:09:06.238 Test: blockdev write zeroes read block ...passed 00:09:06.238 Test: blockdev write zeroes read no split ...passed 00:09:06.238 Test: blockdev write zeroes read split ...passed 00:09:06.238 Test: blockdev write zeroes read split partial ...passed 00:09:06.238 Test: blockdev reset ...passed 00:09:06.238 Test: blockdev write read 8 blocks ...passed 00:09:06.238 Test: blockdev write read size > 128k ...passed 00:09:06.238 Test: blockdev write read invalid size ...passed 00:09:06.238 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.238 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.238 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.238 Test: blockdev writev readv block ...passed 00:09:06.238 Test: blockdev writev readv size > 128k ...passed 
00:09:06.238 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.238 Test: blockdev comparev and writev ...passed 00:09:06.238 Test: blockdev nvme passthru rw ...passed 00:09:06.238 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.238 Test: blockdev nvme admin passthru ...passed 00:09:06.238 Test: blockdev copy ...passed 00:09:06.238 Suite: bdevio tests on: Malloc2p1 00:09:06.238 Test: blockdev write read block ...passed 00:09:06.238 Test: blockdev write zeroes read block ...passed 00:09:06.238 Test: blockdev write zeroes read no split ...passed 00:09:06.238 Test: blockdev write zeroes read split ...passed 00:09:06.238 Test: blockdev write zeroes read split partial ...passed 00:09:06.238 Test: blockdev reset ...passed 00:09:06.238 Test: blockdev write read 8 blocks ...passed 00:09:06.238 Test: blockdev write read size > 128k ...passed 00:09:06.238 Test: blockdev write read invalid size ...passed 00:09:06.238 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.238 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.238 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.238 Test: blockdev writev readv block ...passed 00:09:06.238 Test: blockdev writev readv size > 128k ...passed 00:09:06.238 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.238 Test: blockdev comparev and writev ...passed 00:09:06.238 Test: blockdev nvme passthru rw ...passed 00:09:06.238 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.238 Test: blockdev nvme admin passthru ...passed 00:09:06.238 Test: blockdev copy ...passed 00:09:06.238 Suite: bdevio tests on: Malloc2p0 00:09:06.238 Test: blockdev write read block ...passed 00:09:06.238 Test: blockdev write 
zeroes read block ...passed 00:09:06.238 Test: blockdev write zeroes read no split ...passed 00:09:06.238 Test: blockdev write zeroes read split ...passed 00:09:06.238 Test: blockdev write zeroes read split partial ...passed 00:09:06.238 Test: blockdev reset ...passed 00:09:06.238 Test: blockdev write read 8 blocks ...passed 00:09:06.238 Test: blockdev write read size > 128k ...passed 00:09:06.238 Test: blockdev write read invalid size ...passed 00:09:06.238 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.238 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.238 Test: blockdev write read max offset ...passed 00:09:06.238 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.238 Test: blockdev writev readv 8 blocks ...passed 00:09:06.239 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.239 Test: blockdev writev readv block ...passed 00:09:06.239 Test: blockdev writev readv size > 128k ...passed 00:09:06.239 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.239 Test: blockdev comparev and writev ...passed 00:09:06.239 Test: blockdev nvme passthru rw ...passed 00:09:06.239 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.239 Test: blockdev nvme admin passthru ...passed 00:09:06.239 Test: blockdev copy ...passed 00:09:06.239 Suite: bdevio tests on: Malloc1p1 00:09:06.239 Test: blockdev write read block ...passed 00:09:06.239 Test: blockdev write zeroes read block ...passed 00:09:06.239 Test: blockdev write zeroes read no split ...passed 00:09:06.239 Test: blockdev write zeroes read split ...passed 00:09:06.239 Test: blockdev write zeroes read split partial ...passed 00:09:06.239 Test: blockdev reset ...passed 00:09:06.239 Test: blockdev write read 8 blocks ...passed 00:09:06.239 Test: blockdev write read size > 128k ...passed 00:09:06.239 Test: blockdev write read invalid size ...passed 00:09:06.239 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:09:06.239 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.239 Test: blockdev write read max offset ...passed 00:09:06.239 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.239 Test: blockdev writev readv 8 blocks ...passed 00:09:06.239 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.239 Test: blockdev writev readv block ...passed 00:09:06.239 Test: blockdev writev readv size > 128k ...passed 00:09:06.239 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.239 Test: blockdev comparev and writev ...passed 00:09:06.239 Test: blockdev nvme passthru rw ...passed 00:09:06.239 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.239 Test: blockdev nvme admin passthru ...passed 00:09:06.239 Test: blockdev copy ...passed 00:09:06.239 Suite: bdevio tests on: Malloc1p0 00:09:06.239 Test: blockdev write read block ...passed 00:09:06.239 Test: blockdev write zeroes read block ...passed 00:09:06.239 Test: blockdev write zeroes read no split ...passed 00:09:06.239 Test: blockdev write zeroes read split ...passed 00:09:06.239 Test: blockdev write zeroes read split partial ...passed 00:09:06.239 Test: blockdev reset ...passed 00:09:06.239 Test: blockdev write read 8 blocks ...passed 00:09:06.239 Test: blockdev write read size > 128k ...passed 00:09:06.239 Test: blockdev write read invalid size ...passed 00:09:06.239 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.239 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.239 Test: blockdev write read max offset ...passed 00:09:06.239 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.239 Test: blockdev writev readv 8 blocks ...passed 00:09:06.239 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.239 Test: blockdev writev readv block ...passed 00:09:06.239 Test: blockdev writev readv size > 
128k ...passed 00:09:06.239 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.239 Test: blockdev comparev and writev ...passed 00:09:06.239 Test: blockdev nvme passthru rw ...passed 00:09:06.239 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.239 Test: blockdev nvme admin passthru ...passed 00:09:06.239 Test: blockdev copy ...passed 00:09:06.239 Suite: bdevio tests on: Malloc0 00:09:06.239 Test: blockdev write read block ...passed 00:09:06.239 Test: blockdev write zeroes read block ...passed 00:09:06.239 Test: blockdev write zeroes read no split ...passed 00:09:06.239 Test: blockdev write zeroes read split ...passed 00:09:06.239 Test: blockdev write zeroes read split partial ...passed 00:09:06.239 Test: blockdev reset ...passed 00:09:06.239 Test: blockdev write read 8 blocks ...passed 00:09:06.239 Test: blockdev write read size > 128k ...passed 00:09:06.239 Test: blockdev write read invalid size ...passed 00:09:06.239 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:06.239 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:06.239 Test: blockdev write read max offset ...passed 00:09:06.239 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:06.239 Test: blockdev writev readv 8 blocks ...passed 00:09:06.239 Test: blockdev writev readv 30 x 1block ...passed 00:09:06.239 Test: blockdev writev readv block ...passed 00:09:06.239 Test: blockdev writev readv size > 128k ...passed 00:09:06.239 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:06.239 Test: blockdev comparev and writev ...passed 00:09:06.239 Test: blockdev nvme passthru rw ...passed 00:09:06.239 Test: blockdev nvme passthru vendor specific ...passed 00:09:06.239 Test: blockdev nvme admin passthru ...passed 00:09:06.239 Test: blockdev copy ...passed 00:09:06.239 00:09:06.239 Run Summary: Type Total Ran Passed Failed Inactive 00:09:06.239 suites 16 16 n/a 0 0 00:09:06.239 
tests 368 368 368 0 0 00:09:06.239 asserts 2224 2224 2224 0 n/a 00:09:06.239 00:09:06.239 Elapsed time = 0.665 seconds 00:09:06.239 0 00:09:06.239 19:44:57 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1352724 00:09:06.239 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1352724 ']' 00:09:06.239 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1352724 00:09:06.239 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:09:06.239 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:06.239 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1352724 00:09:06.498 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:06.498 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:06.498 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1352724' 00:09:06.498 killing process with pid 1352724 00:09:06.498 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1352724 00:09:06.498 19:44:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1352724 00:09:06.758 19:44:58 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:09:06.758 00:09:06.758 real 0m1.820s 00:09:06.758 user 0m4.491s 00:09:06.758 sys 0m0.513s 00:09:06.758 19:44:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:06.758 19:44:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:06.758 ************************************ 00:09:06.758 END TEST bdev_bounds 00:09:06.758 ************************************ 00:09:06.758 19:44:58 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:06.758 19:44:58 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:06.758 19:44:58 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:06.758 19:44:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:06.758 ************************************ 00:09:06.758 START TEST bdev_nbd 00:09:06.758 ************************************ 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:09:06.758 19:44:58 
blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1353099 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1353099 /var/tmp/spdk-nbd.sock 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1353099 ']' 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk-nbd.sock 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:06.758 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:06.758 19:44:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:06.758 [2024-07-24 19:44:58.314965] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:09:06.758 [2024-07-24 19:44:58.315034] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:07.017 [2024-07-24 19:44:58.446134] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.017 [2024-07-24 19:44:58.547901] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.276 [2024-07-24 19:44:58.714628] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:07.276 [2024-07-24 19:44:58.714683] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:07.276 [2024-07-24 19:44:58.714697] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:07.276 [2024-07-24 19:44:58.722631] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:07.276 [2024-07-24 19:44:58.722658] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:07.276 [2024-07-24 19:44:58.730643] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: Malloc2 00:09:07.276 [2024-07-24 19:44:58.730666] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:07.276 [2024-07-24 19:44:58.807933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:07.276 [2024-07-24 19:44:58.807989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:07.276 [2024-07-24 19:44:58.808006] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24e3e40 00:09:07.276 [2024-07-24 19:44:58.808018] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:07.276 [2024-07-24 19:44:58.809489] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:07.276 [2024-07-24 19:44:58.809520] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:07.842 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 
Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.843 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- 
# break 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.102 1+0 records in 00:09:08.102 1+0 records out 00:09:08.102 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225679 s, 18.1 MB/s 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.102 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:09:08.361 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:09:08.361 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:09:08.361 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:09:08.361 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:09:08.361 19:44:59 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # local i 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.362 1+0 records in 00:09:08.362 1+0 records out 00:09:08.362 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299336 s, 13.7 MB/s 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.362 19:44:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:09:08.621 19:45:00 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.621 1+0 records in 00:09:08.621 1+0 records out 00:09:08.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328976 s, 12.5 MB/s 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@889 -- # return 0 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.621 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.880 1+0 records in 00:09:08.880 1+0 records out 00:09:08.880 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000396177 s, 10.3 MB/s 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.880 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 
00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.139 1+0 records in 00:09:09.139 1+0 records out 00:09:09.139 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000340129 s, 12.0 MB/s 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.139 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i 
<= 20 )) 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.398 1+0 records in 00:09:09.398 1+0 records out 00:09:09.398 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0003001 s, 13.6 MB/s 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:09.398 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.399 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.399 19:45:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.967 1+0 records in 00:09:09.967 1+0 records out 00:09:09.967 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000492872 s, 8.3 MB/s 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.967 19:45:01 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.967 1+0 records in 00:09:09.967 1+0 records out 00:09:09.967 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000451759 s, 9.1 MB/s 00:09:09.967 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.226 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:10.226 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.226 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:10.226 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:10.226 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.226 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.226 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.485 1+0 records in 00:09:10.485 1+0 records out 
00:09:10.485 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000571284 s, 7.2 MB/s 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.485 19:45:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:10.745 19:45:02 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.745 1+0 records in 00:09:10.745 1+0 records out 00:09:10.745 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000427977 s, 9.6 MB/s 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.745 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 
00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:11.004 1+0 records in 00:09:11.004 1+0 records out 00:09:11.004 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00056159 s, 7.3 MB/s 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:11.004 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:11.264 1+0 records in 00:09:11.264 1+0 records out 00:09:11.264 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000584005 s, 7.0 MB/s 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:11.264 
19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:11.264 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:11.524 1+0 records in 00:09:11.524 1+0 records out 00:09:11.524 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000547514 s, 7.5 MB/s 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.524 19:45:02 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@886 -- # size=4096 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:11.524 19:45:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:11.783 1+0 records in 00:09:11.783 1+0 records out 00:09:11.783 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000727612 s, 5.6 MB/s 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:11.783 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q 
-w nbd14 /proc/partitions 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:12.042 1+0 records in 00:09:12.042 1+0 records out 00:09:12.042 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000593845 s, 6.9 MB/s 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:12.042 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:12.301 1+0 records in 00:09:12.301 1+0 records out 00:09:12.301 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00064867 s, 6.3 MB/s 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:12.301 19:45:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd0", 00:09:12.561 "bdev_name": "Malloc0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd1", 00:09:12.561 "bdev_name": "Malloc1p0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd2", 00:09:12.561 "bdev_name": "Malloc1p1" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd3", 00:09:12.561 "bdev_name": "Malloc2p0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd4", 00:09:12.561 "bdev_name": "Malloc2p1" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd5", 00:09:12.561 "bdev_name": "Malloc2p2" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd6", 00:09:12.561 "bdev_name": "Malloc2p3" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd7", 00:09:12.561 "bdev_name": "Malloc2p4" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd8", 00:09:12.561 "bdev_name": "Malloc2p5" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd9", 00:09:12.561 "bdev_name": "Malloc2p6" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd10", 00:09:12.561 "bdev_name": "Malloc2p7" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd11", 00:09:12.561 "bdev_name": "TestPT" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd12", 00:09:12.561 "bdev_name": "raid0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd13", 00:09:12.561 "bdev_name": "concat0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd14", 00:09:12.561 "bdev_name": "raid1" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd15", 00:09:12.561 "bdev_name": "AIO0" 00:09:12.561 } 00:09:12.561 ]' 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd0", 00:09:12.561 "bdev_name": "Malloc0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd1", 00:09:12.561 "bdev_name": "Malloc1p0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd2", 00:09:12.561 "bdev_name": "Malloc1p1" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd3", 00:09:12.561 "bdev_name": "Malloc2p0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd4", 00:09:12.561 "bdev_name": "Malloc2p1" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd5", 00:09:12.561 "bdev_name": "Malloc2p2" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd6", 00:09:12.561 "bdev_name": "Malloc2p3" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd7", 00:09:12.561 "bdev_name": "Malloc2p4" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd8", 00:09:12.561 "bdev_name": "Malloc2p5" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd9", 00:09:12.561 "bdev_name": "Malloc2p6" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd10", 00:09:12.561 "bdev_name": "Malloc2p7" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd11", 00:09:12.561 "bdev_name": "TestPT" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd12", 00:09:12.561 "bdev_name": "raid0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd13", 00:09:12.561 "bdev_name": "concat0" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd14", 00:09:12.561 "bdev_name": "raid1" 00:09:12.561 }, 00:09:12.561 { 00:09:12.561 "nbd_device": "/dev/nbd15", 00:09:12.561 "bdev_name": "AIO0" 00:09:12.561 } 00:09:12.561 ]' 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.561 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.820 19:45:04 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.820 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.079 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.080 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.080 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.339 19:45:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.598 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:13.857 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:13.857 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:13.857 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:13.857 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.857 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.857 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:09:13.857 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.858 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.858 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.858 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.117 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.376 19:45:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.636 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.895 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.154 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:15.413 19:45:06 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.413 19:45:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.673 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:15.931 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:09:15.931 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:15.931 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:15.931 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.931 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.932 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:15.932 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.932 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.932 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.932 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:16.191 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:16.450 19:45:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:16.709 19:45:08 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:16.709 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.968 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:17.227 /dev/nbd0 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:17.227 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.228 1+0 records in 00:09:17.228 1+0 records out 00:09:17.228 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251884 s, 16.3 MB/s 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # size=4096 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.228 19:45:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:09:17.487 /dev/nbd1 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:17.487 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:09:17.488 1+0 records in 00:09:17.488 1+0 records out 00:09:17.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262157 s, 15.6 MB/s 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.488 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:09:17.747 /dev/nbd10 00:09:17.747 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:09:17.747 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:09:17.747 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:09:17.747 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:17.747 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:17.747 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:17.747 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.006 19:45:09 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.006 1+0 records in 00:09:18.006 1+0 records out 00:09:18.006 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328251 s, 12.5 MB/s 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:09:18.006 /dev/nbd11 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.006 1+0 records in 00:09:18.006 1+0 records out 00:09:18.006 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032094 s, 12.8 MB/s 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.006 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:09:18.266 /dev/nbd12 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename 
/dev/nbd12 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.266 1+0 records in 00:09:18.266 1+0 records out 00:09:18.266 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000401516 s, 10.2 MB/s 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.266 19:45:09 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.266 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:18.524 /dev/nbd13 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.524 1+0 records in 00:09:18.524 1+0 records out 00:09:18.524 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000417604 s, 9.8 MB/s 00:09:18.524 19:45:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.524 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.524 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.524 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.524 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.524 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.524 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.524 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:18.784 /dev/nbd14 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd14 /proc/partitions 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.784 1+0 records in 00:09:18.784 1+0 records out 00:09:18.784 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000442824 s, 
9.2 MB/s 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.784 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:19.043 /dev/nbd15 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.043 1+0 records in 00:09:19.043 1+0 records out 00:09:19.043 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00044101 s, 9.3 MB/s 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.043 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:19.303 /dev/nbd2 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 
00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.303 1+0 records in 00:09:19.303 1+0 records out 00:09:19.303 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000447213 s, 9.2 MB/s 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.303 19:45:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:19.561 /dev/nbd3 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:19.561 19:45:11 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.561 1+0 records in 00:09:19.561 1+0 records out 00:09:19.561 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000380064 s, 10.8 MB/s 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.561 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:19.820 /dev/nbd4 00:09:19.820 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.080 1+0 records in 00:09:20.080 1+0 records out 00:09:20.080 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000478101 s, 8.6 MB/s 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 
00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.080 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:20.339 /dev/nbd5 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.339 1+0 records in 00:09:20.339 1+0 records out 00:09:20.339 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000568471 s, 7.2 MB/s 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:20.339 /dev/nbd6 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:20.339 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:09:20.599 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:20.599 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.600 1+0 records in 00:09:20.600 1+0 records out 00:09:20.600 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000569426 s, 7.2 MB/s 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.600 19:45:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:20.600 /dev/nbd7 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:09:20.860 19:45:12 
blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.860 1+0 records in 00:09:20.860 1+0 records out 00:09:20.860 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000649712 s, 6.3 MB/s 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.860 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:21.119 /dev/nbd8 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # 
local i 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.119 1+0 records in 00:09:21.119 1+0 records out 00:09:21.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000687547 s, 6.0 MB/s 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:21.119 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:09:21.378 /dev/nbd9 00:09:21.378 19:45:12 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:21.378 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.379 1+0 records in 00:09:21.379 1+0 records out 00:09:21.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000846886 s, 4.8 MB/s 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:21.379 19:45:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:21.638 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd0", 00:09:21.638 "bdev_name": "Malloc0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd1", 00:09:21.638 "bdev_name": "Malloc1p0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd10", 00:09:21.638 "bdev_name": "Malloc1p1" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd11", 00:09:21.638 "bdev_name": "Malloc2p0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd12", 00:09:21.638 "bdev_name": "Malloc2p1" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd13", 00:09:21.638 "bdev_name": "Malloc2p2" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd14", 00:09:21.638 "bdev_name": "Malloc2p3" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd15", 00:09:21.638 "bdev_name": "Malloc2p4" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd2", 00:09:21.638 "bdev_name": "Malloc2p5" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd3", 00:09:21.638 "bdev_name": "Malloc2p6" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd4", 00:09:21.638 "bdev_name": "Malloc2p7" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd5", 00:09:21.638 "bdev_name": "TestPT" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd6", 00:09:21.638 
"bdev_name": "raid0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd7", 00:09:21.638 "bdev_name": "concat0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd8", 00:09:21.638 "bdev_name": "raid1" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd9", 00:09:21.638 "bdev_name": "AIO0" 00:09:21.638 } 00:09:21.638 ]' 00:09:21.638 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd0", 00:09:21.638 "bdev_name": "Malloc0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd1", 00:09:21.638 "bdev_name": "Malloc1p0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd10", 00:09:21.638 "bdev_name": "Malloc1p1" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd11", 00:09:21.638 "bdev_name": "Malloc2p0" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd12", 00:09:21.638 "bdev_name": "Malloc2p1" 00:09:21.638 }, 00:09:21.638 { 00:09:21.638 "nbd_device": "/dev/nbd13", 00:09:21.638 "bdev_name": "Malloc2p2" 00:09:21.638 }, 00:09:21.638 { 00:09:21.639 "nbd_device": "/dev/nbd14", 00:09:21.639 "bdev_name": "Malloc2p3" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd15", 00:09:21.639 "bdev_name": "Malloc2p4" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd2", 00:09:21.639 "bdev_name": "Malloc2p5" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd3", 00:09:21.639 "bdev_name": "Malloc2p6" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd4", 00:09:21.639 "bdev_name": "Malloc2p7" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd5", 00:09:21.639 "bdev_name": "TestPT" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd6", 00:09:21.639 "bdev_name": "raid0" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd7", 00:09:21.639 "bdev_name": "concat0" 00:09:21.639 }, 00:09:21.639 { 
00:09:21.639 "nbd_device": "/dev/nbd8", 00:09:21.639 "bdev_name": "raid1" 00:09:21.639 }, 00:09:21.639 { 00:09:21.639 "nbd_device": "/dev/nbd9", 00:09:21.639 "bdev_name": "AIO0" 00:09:21.639 } 00:09:21.639 ]' 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:21.639 /dev/nbd1 00:09:21.639 /dev/nbd10 00:09:21.639 /dev/nbd11 00:09:21.639 /dev/nbd12 00:09:21.639 /dev/nbd13 00:09:21.639 /dev/nbd14 00:09:21.639 /dev/nbd15 00:09:21.639 /dev/nbd2 00:09:21.639 /dev/nbd3 00:09:21.639 /dev/nbd4 00:09:21.639 /dev/nbd5 00:09:21.639 /dev/nbd6 00:09:21.639 /dev/nbd7 00:09:21.639 /dev/nbd8 00:09:21.639 /dev/nbd9' 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:21.639 /dev/nbd1 00:09:21.639 /dev/nbd10 00:09:21.639 /dev/nbd11 00:09:21.639 /dev/nbd12 00:09:21.639 /dev/nbd13 00:09:21.639 /dev/nbd14 00:09:21.639 /dev/nbd15 00:09:21.639 /dev/nbd2 00:09:21.639 /dev/nbd3 00:09:21.639 /dev/nbd4 00:09:21.639 /dev/nbd5 00:09:21.639 /dev/nbd6 00:09:21.639 /dev/nbd7 00:09:21.639 /dev/nbd8 00:09:21.639 /dev/nbd9' 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:21.639 256+0 records in 00:09:21.639 256+0 records out 00:09:21.639 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116 s, 90.4 MB/s 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.639 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:21.899 256+0 records in 00:09:21.899 256+0 records out 00:09:21.899 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.178406 s, 5.9 MB/s 00:09:21.899 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.899 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:22.157 256+0 records in 00:09:22.157 256+0 records out 00:09:22.157 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183805 s, 5.7 MB/s 00:09:22.157 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:09:22.157 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:22.157 256+0 records in 00:09:22.157 256+0 records out 00:09:22.157 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184349 s, 5.7 MB/s 00:09:22.157 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.157 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:22.522 256+0 records in 00:09:22.522 256+0 records out 00:09:22.522 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184491 s, 5.7 MB/s 00:09:22.522 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.522 19:45:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:22.782 256+0 records in 00:09:22.782 256+0 records out 00:09:22.782 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183941 s, 5.7 MB/s 00:09:22.782 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.782 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:22.782 256+0 records in 00:09:22.782 256+0 records out 00:09:22.782 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182585 s, 5.7 MB/s 00:09:22.782 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.782 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:09:23.041 256+0 records in 00:09:23.041 256+0 
records out 00:09:23.041 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.176428 s, 5.9 MB/s 00:09:23.041 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.041 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:23.300 256+0 records in 00:09:23.300 256+0 records out 00:09:23.300 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181736 s, 5.8 MB/s 00:09:23.300 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.300 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:23.300 256+0 records in 00:09:23.300 256+0 records out 00:09:23.300 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183636 s, 5.7 MB/s 00:09:23.300 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.300 19:45:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:23.560 256+0 records in 00:09:23.560 256+0 records out 00:09:23.560 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183377 s, 5.7 MB/s 00:09:23.560 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.560 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:23.819 256+0 records in 00:09:23.819 256+0 records out 00:09:23.819 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183602 s, 5.7 MB/s 00:09:23.819 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.819 19:45:15 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:23.819 256+0 records in 00:09:23.819 256+0 records out 00:09:23.819 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183784 s, 5.7 MB/s 00:09:23.819 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.819 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:24.078 256+0 records in 00:09:24.078 256+0 records out 00:09:24.078 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18443 s, 5.7 MB/s 00:09:24.079 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:24.079 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:24.338 256+0 records in 00:09:24.338 256+0 records out 00:09:24.338 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.143727 s, 7.3 MB/s 00:09:24.338 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:24.338 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:24.598 256+0 records in 00:09:24.598 256+0 records out 00:09:24.598 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.187556 s, 5.6 MB/s 00:09:24.598 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:24.598 256+0 records in 00:09:24.598 256+0 records out 00:09:24.598 1048576 bytes (1.0 MB, 1.0 MiB) 
copied, 0.182067 s, 5.8 MB/s 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:24.598 19:45:16 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.598 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in 
"${nbd_list[@]}" 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.860 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 
00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.861 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.121 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.380 19:45:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:25.947 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:25.947 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:25.947 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:25.948 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.948 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.948 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:25.948 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.948 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.948 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.948 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.207 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.466 19:45:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.725 19:45:18 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.725 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.983 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.242 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.500 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.501 19:45:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.759 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:28.017 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:28.276 19:45:19 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:28.276 19:45:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:28.534 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:28.793 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:28.793 
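The `nbd_common.sh@35`–`@45` trace repeated above for each device corresponds to a polling helper. A minimal reconstruction is sketched below: the function name, the `grep -q -w … /proc/partitions` check, the 1..20 loop bound, and the `break`/`return 0` structure are all taken from the trace itself, while the partitions-file parameter and the sleep interval are assumptions added for illustration, not part of the traced script.

```shell
# Reconstructed sketch of the waitfornbd_exit helper traced above
# (nbd_common.sh@35-45). Polls the partitions table until the named
# nbd device disappears, giving up after 20 attempts.
waitfornbd_exit() {
    local nbd_name=$1
    local partitions=${2:-/proc/partitions}  # override point is an assumption, for testability
    local i
    for ((i = 1; i <= 20; i++)); do
        if ! grep -q -w "$nbd_name" "$partitions"; then
            break                            # device no longer listed: done waiting
        fi
        sleep 0.1                            # poll interval is an assumption
    done
    return 0
}
```

In the log, each `nbd_stop_disk` RPC is immediately followed by one pass through this loop, and the `break` on the first iteration indicates the device had already left `/proc/partitions`.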
19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:29.052 malloc_lvol_verify 00:09:29.052 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:29.310 5119f319-2daa-47c5-bd03-1ca83f3bd8a6 00:09:29.310 19:45:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:29.569 288508f2-d803-4e02-9453-20a173e5ba6a 00:09:29.569 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:29.828 /dev/nbd0 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:29.828 mke2fs 1.46.5 (30-Dec-2021) 00:09:29.828 Discarding device blocks: 0/4096 done 00:09:29.828 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:29.828 00:09:29.828 Allocating group tables: 0/1 done 00:09:29.828 Writing inode tables: 0/1 done 00:09:29.828 Creating journal (1024 blocks): done 00:09:29.828 Writing superblocks and filesystem accounting information: 0/1 done 00:09:29.828 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:29.828 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1353099 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1353099 ']' 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1353099 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:30.087 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1353099 00:09:30.088 19:45:21 
blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:30.088 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:30.088 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1353099' 00:09:30.088 killing process with pid 1353099 00:09:30.088 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1353099 00:09:30.088 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1353099 00:09:30.657 19:45:21 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:09:30.657 00:09:30.657 real 0m23.743s 00:09:30.657 user 0m28.861s 00:09:30.657 sys 0m13.722s 00:09:30.657 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:30.657 19:45:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:30.657 ************************************ 00:09:30.657 END TEST bdev_nbd 00:09:30.657 ************************************ 00:09:30.657 19:45:22 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:09:30.657 19:45:22 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:09:30.657 19:45:22 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:09:30.657 19:45:22 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:09:30.657 19:45:22 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:30.657 19:45:22 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:30.657 19:45:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:30.657 ************************************ 00:09:30.657 START TEST bdev_fio 00:09:30.657 ************************************ 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:09:30.657 19:45:22 
blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:30.657 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p0]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 
00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.657 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:30.658 19:45:22 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:30.658 ************************************ 00:09:30.658 START TEST bdev_fio_rw_verify 00:09:30.658 ************************************ 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local 
fio_dir=/usr/src/fio 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:30.658 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:30.917 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:30.917 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:30.917 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:30.917 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:30.917 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:30.917 19:45:22 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:30.917 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:30.917 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:30.918 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:30.918 19:45:22 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:31.177 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:31.177 fio-3.35 00:09:31.177 Starting 16 threads 00:09:43.383 00:09:43.383 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1357451: Wed Jul 24 19:45:33 2024 00:09:43.383 read: IOPS=86.6k, BW=338MiB/s (355MB/s)(3383MiB/10001msec) 00:09:43.383 slat (nsec): min=1959, max=1015.7k, avg=37830.16, stdev=15001.34 00:09:43.383 clat (usec): min=11, max=1551, avg=303.06, stdev=137.09 00:09:43.383 lat (usec): min=24, max=1586, avg=340.89, stdev=144.98 00:09:43.383 clat percentiles (usec): 00:09:43.383 | 50.000th=[ 293], 99.000th=[ 619], 99.900th=[ 693], 99.990th=[ 1004], 00:09:43.383 | 99.999th=[ 1188] 00:09:43.383 write: IOPS=137k, BW=536MiB/s (562MB/s)(5288MiB/9868msec); 0 zone resets 00:09:43.383 slat (usec): min=4, max=3872, avg=50.63, stdev=16.08 00:09:43.383 clat (usec): min=13, max=1763, avg=352.64, stdev=161.15 00:09:43.383 lat (usec): min=40, max=4418, avg=403.27, stdev=169.16 00:09:43.383 clat percentiles 
(usec): 00:09:43.383 | 50.000th=[ 334], 99.000th=[ 791], 99.900th=[ 1037], 99.990th=[ 1106], 00:09:43.383 | 99.999th=[ 1205] 00:09:43.383 bw ( KiB/s): min=433979, max=702550, per=98.80%, avg=542161.32, stdev=4292.02, samples=304 00:09:43.383 iops : min=108494, max=175635, avg=135540.16, stdev=1072.99, samples=304 00:09:43.383 lat (usec) : 20=0.01%, 50=0.34%, 100=3.26%, 250=30.48%, 500=51.54% 00:09:43.383 lat (usec) : 750=13.56%, 1000=0.72% 00:09:43.383 lat (msec) : 2=0.11% 00:09:43.383 cpu : usr=99.25%, sys=0.34%, ctx=616, majf=0, minf=2686 00:09:43.383 IO depths : 1=12.4%, 2=24.8%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:43.383 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:43.383 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:43.383 issued rwts: total=865951,1353704,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:43.383 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:43.383 00:09:43.383 Run status group 0 (all jobs): 00:09:43.383 READ: bw=338MiB/s (355MB/s), 338MiB/s-338MiB/s (355MB/s-355MB/s), io=3383MiB (3547MB), run=10001-10001msec 00:09:43.383 WRITE: bw=536MiB/s (562MB/s), 536MiB/s-536MiB/s (562MB/s-562MB/s), io=5288MiB (5545MB), run=9868-9868msec 00:09:43.383 00:09:43.383 real 0m11.798s 00:09:43.383 user 2m45.363s 00:09:43.383 sys 0m1.411s 00:09:43.383 19:45:34 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:43.383 19:45:34 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:43.383 ************************************ 00:09:43.383 END TEST bdev_fio_rw_verify 00:09:43.383 ************************************ 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:43.383 19:45:34 
blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:43.383 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:43.384 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' 
"59773273-a1e3-49ce-853c-7e5a5a262862"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "59773273-a1e3-49ce-853c-7e5a5a262862",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "f7c0a053-2d2b-5329-8b1c-a04dfe1b00e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f7c0a053-2d2b-5329-8b1c-a04dfe1b00e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' 
'}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "1cc3d25a-82b3-5ba5-806e-4893caef2daa"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "1cc3d25a-82b3-5ba5-806e-4893caef2daa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "bb4ff44c-edbf-540a-bdf6-dce9d68fdcc3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb4ff44c-edbf-540a-bdf6-dce9d68fdcc3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' 
"6acfe163-5f4a-5fd0-be57-3556a45221fc"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6acfe163-5f4a-5fd0-be57-3556a45221fc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "9e07eae6-4e10-5293-80b4-8f0647addbdf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9e07eae6-4e10-5293-80b4-8f0647addbdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "926b0284-0a2c-50a8-bf70-6a93a860eed1"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "926b0284-0a2c-50a8-bf70-6a93a860eed1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "0396453e-446c-57cb-b009-a1d4702ddb03"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0396453e-446c-57cb-b009-a1d4702ddb03",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "9406c365-10a4-5439-9a7a-d9bb26a413b4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 
512,' ' "num_blocks": 8192,' ' "uuid": "9406c365-10a4-5439-9a7a-d9bb26a413b4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "3a48ef9a-b83f-5e5a-864b-1505facb2bd9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3a48ef9a-b83f-5e5a-864b-1505facb2bd9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "12afcef7-72f9-502e-b539-157a51701373"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"12afcef7-72f9-502e-b539-157a51701373",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "47aecfc9-7fa2-5692-9819-db838377c056"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "47aecfc9-7fa2-5692-9819-db838377c056",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' 
"eaac3e18-96ea-424a-a31c-5005985819d3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "eaac3e18-96ea-424a-a31c-5005985819d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "eaac3e18-96ea-424a-a31c-5005985819d3",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "a0fcc8a6-312c-4698-9353-6a8f1ed4c2d2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "a7f7745d-9cea-484c-8eac-7ce7191e7ded",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f5f87728-a607-4252-9fae-90a82788fe4e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f5f87728-a607-4252-9fae-90a82788fe4e",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f5f87728-a607-4252-9fae-90a82788fe4e",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "c89cce57-1c27-4624-99ce-dcdc0818a832",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6bcbd865-1714-4c98-9d71-ef5d9f8e19d1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "ce63bbd1-df84-4168-9d84-776f26fe1aa6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ce63bbd1-df84-4168-9d84-776f26fe1aa6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "ce63bbd1-df84-4168-9d84-776f26fe1aa6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "0ba9049c-e555-48a5-9d0c-b7a3f63c214d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "c7ed200b-322e-42f5-83ba-a3b964556d32",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "7292d96a-990f-4c4b-aad3-e15a946986a6"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "7292d96a-990f-4c4b-aad3-e15a946986a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' 
"write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:43.384 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:09:43.384 Malloc1p0 00:09:43.384 Malloc1p1 00:09:43.384 Malloc2p0 00:09:43.384 Malloc2p1 00:09:43.384 Malloc2p2 00:09:43.384 Malloc2p3 00:09:43.384 Malloc2p4 00:09:43.384 Malloc2p5 00:09:43.384 Malloc2p6 00:09:43.384 Malloc2p7 00:09:43.384 TestPT 00:09:43.384 raid0 00:09:43.384 concat0 ]] 00:09:43.384 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "59773273-a1e3-49ce-853c-7e5a5a262862"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "59773273-a1e3-49ce-853c-7e5a5a262862",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "f7c0a053-2d2b-5329-8b1c-a04dfe1b00e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "f7c0a053-2d2b-5329-8b1c-a04dfe1b00e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "1cc3d25a-82b3-5ba5-806e-4893caef2daa"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "1cc3d25a-82b3-5ba5-806e-4893caef2daa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "bb4ff44c-edbf-540a-bdf6-dce9d68fdcc3"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb4ff44c-edbf-540a-bdf6-dce9d68fdcc3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "6acfe163-5f4a-5fd0-be57-3556a45221fc"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6acfe163-5f4a-5fd0-be57-3556a45221fc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "9e07eae6-4e10-5293-80b4-8f0647addbdf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9e07eae6-4e10-5293-80b4-8f0647addbdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "926b0284-0a2c-50a8-bf70-6a93a860eed1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "926b0284-0a2c-50a8-bf70-6a93a860eed1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": 
"Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "0396453e-446c-57cb-b009-a1d4702ddb03"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0396453e-446c-57cb-b009-a1d4702ddb03",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "9406c365-10a4-5439-9a7a-d9bb26a413b4"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9406c365-10a4-5439-9a7a-d9bb26a413b4",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' 
'}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "3a48ef9a-b83f-5e5a-864b-1505facb2bd9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3a48ef9a-b83f-5e5a-864b-1505facb2bd9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "12afcef7-72f9-502e-b539-157a51701373"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "12afcef7-72f9-502e-b539-157a51701373",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' 
"47aecfc9-7fa2-5692-9819-db838377c056"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "47aecfc9-7fa2-5692-9819-db838377c056",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "eaac3e18-96ea-424a-a31c-5005985819d3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "eaac3e18-96ea-424a-a31c-5005985819d3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' 
{' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "eaac3e18-96ea-424a-a31c-5005985819d3",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "a0fcc8a6-312c-4698-9353-6a8f1ed4c2d2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "a7f7745d-9cea-484c-8eac-7ce7191e7ded",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "f5f87728-a607-4252-9fae-90a82788fe4e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f5f87728-a607-4252-9fae-90a82788fe4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' 
"dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f5f87728-a607-4252-9fae-90a82788fe4e",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "c89cce57-1c27-4624-99ce-dcdc0818a832",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6bcbd865-1714-4c98-9d71-ef5d9f8e19d1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "ce63bbd1-df84-4168-9d84-776f26fe1aa6"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ce63bbd1-df84-4168-9d84-776f26fe1aa6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": 
"ce63bbd1-df84-4168-9d84-776f26fe1aa6",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "0ba9049c-e555-48a5-9d0c-b7a3f63c214d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "c7ed200b-322e-42f5-83ba-a3b964556d32",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "7292d96a-990f-4c4b-aad3-e15a946986a6"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "7292d96a-990f-4c4b-aad3-e15a946986a6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == 
true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:43.386 19:45:34 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:43.386 ************************************ 00:09:43.386 START TEST bdev_fio_trim 00:09:43.386 ************************************ 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:43.386 
19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:43.386 19:45:34 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:43.387 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:09:43.387 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:43.387 fio-3.35 00:09:43.387 Starting 14 threads 00:09:55.591 00:09:55.591 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1359148: Wed Jul 24 19:45:45 2024 00:09:55.591 write: IOPS=113k, BW=439MiB/s (461MB/s)(4395MiB/10001msec); 0 zone resets 00:09:55.591 slat (usec): min=2, max=3576, avg=43.96, stdev=15.76 00:09:55.591 clat (usec): min=28, max=3845, avg=311.32, stdev=124.61 00:09:55.591 lat (usec): 
min=44, max=3870, avg=355.28, stdev=133.28 00:09:55.591 clat percentiles (usec): 00:09:55.591 | 50.000th=[ 289], 99.000th=[ 603], 99.900th=[ 660], 99.990th=[ 816], 00:09:55.591 | 99.999th=[ 1172] 00:09:55.591 bw ( KiB/s): min=376192, max=722176, per=100.00%, avg=452635.37, stdev=6873.89, samples=266 00:09:55.591 iops : min=94048, max=180542, avg=113158.74, stdev=1718.46, samples=266 00:09:55.591 trim: IOPS=113k, BW=439MiB/s (461MB/s)(4395MiB/10001msec); 0 zone resets 00:09:55.591 slat (usec): min=4, max=1461, avg=30.53, stdev=10.21 00:09:55.591 clat (usec): min=4, max=3871, avg=349.28, stdev=140.21 00:09:55.591 lat (usec): min=18, max=3891, avg=379.82, stdev=146.35 00:09:55.591 clat percentiles (usec): 00:09:55.591 | 50.000th=[ 334], 99.000th=[ 660], 99.900th=[ 725], 99.990th=[ 906], 00:09:55.591 | 99.999th=[ 1303] 00:09:55.591 bw ( KiB/s): min=376200, max=722176, per=100.00%, avg=452635.79, stdev=6873.90, samples=266 00:09:55.591 iops : min=94050, max=180542, avg=113158.95, stdev=1718.46, samples=266 00:09:55.591 lat (usec) : 10=0.01%, 20=0.03%, 50=0.13%, 100=1.22%, 250=30.09% 00:09:55.591 lat (usec) : 500=55.64%, 750=12.86%, 1000=0.04% 00:09:55.591 lat (msec) : 2=0.01%, 4=0.01% 00:09:55.591 cpu : usr=99.54%, sys=0.00%, ctx=623, majf=0, minf=973 00:09:55.591 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:55.591 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:55.591 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:55.591 issued rwts: total=0,1125169,1125178,0 short=0,0,0,0 dropped=0,0,0,0 00:09:55.591 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:55.591 00:09:55.591 Run status group 0 (all jobs): 00:09:55.591 WRITE: bw=439MiB/s (461MB/s), 439MiB/s-439MiB/s (461MB/s-461MB/s), io=4395MiB (4609MB), run=10001-10001msec 00:09:55.591 TRIM: bw=439MiB/s (461MB/s), 439MiB/s-439MiB/s (461MB/s-461MB/s), io=4395MiB (4609MB), run=10001-10001msec 00:09:55.591 
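The run status group above reports ~113k IOPS at a 4 KiB block size and 439 MiB/s over the ~10 s run. As a quick sanity check of that summary arithmetic (figures below are rounded values taken from the log lines above, not recomputed from raw samples):

```shell
# Cross-check the fio summary: ~4395 MiB moved in ~10 s at bs=4k.
mib_total=4395
runtime_s=10
bw_mib_s=$((mib_total / runtime_s))       # expected ~439 MiB/s
iops=$((bw_mib_s * 1024 * 1024 / 4096))   # 4 KiB blocks per second, ~112k
echo "bw=${bw_mib_s}MiB/s iops=${iops}"
```

This lines up with the WRITE/TRIM lines (439MiB/s, IOPS=113k); the small gap comes from integer rounding and the 10001 msec actual runtime.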
00:09:55.591 real 0m11.684s 00:09:55.591 user 2m26.199s 00:09:55.591 sys 0m1.040s 00:09:55.591 19:45:45 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:55.591 19:45:45 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:55.591 ************************************ 00:09:55.591 END TEST bdev_fio_trim 00:09:55.591 ************************************ 00:09:55.591 19:45:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:09:55.591 19:45:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:55.591 19:45:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:09:55.591 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:55.591 19:45:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:09:55.591 00:09:55.591 real 0m23.883s 00:09:55.591 user 5m11.779s 00:09:55.591 sys 0m2.669s 00:09:55.591 19:45:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:55.591 19:45:45 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:55.591 ************************************ 00:09:55.591 END TEST bdev_fio 00:09:55.591 ************************************ 00:09:55.591 19:45:45 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:55.591 19:45:45 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:55.591 19:45:45 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:09:55.591 19:45:46 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:55.592 19:45:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 
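The `[job_*]` sections logged before the trim run are emitted by the loop in `bdev/blockdev.sh` that keeps only bdevs whose `supported_io_types.unmap` is true — which is why `raid1` and `AIO0`, both dumped above with `"unmap": false`, get no trim job. A minimal sketch of that filter, assuming `jq` is on PATH and using two hypothetical one-line bdev records in place of the full `bdev_get_bdevs` dump:

```shell
# Hedged sketch of the job-section generation seen in the log:
# keep only bdevs that advertise unmap support, emit one fio job per name.
bdevs='{"name":"Malloc0","supported_io_types":{"unmap":true}}
{"name":"raid1","supported_io_types":{"unmap":false}}'

for b in $(printf '%s\n' "$bdevs" | jq -r 'select(.supported_io_types.unmap == true) | .name'); do
    echo "[job_$b]"
    echo "filename=$b"
done
```

With the sample input this prints only `[job_Malloc0]` / `filename=Malloc0`; `raid1` is filtered out, matching the job list in the log.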
00:09:55.592 ************************************ 00:09:55.592 START TEST bdev_verify 00:09:55.592 ************************************ 00:09:55.592 19:45:46 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:55.592 [2024-07-24 19:45:46.097455] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:09:55.592 [2024-07-24 19:45:46.097507] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1360591 ] 00:09:55.592 [2024-07-24 19:45:46.209489] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:55.592 [2024-07-24 19:45:46.318483] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:55.592 [2024-07-24 19:45:46.318489] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.592 [2024-07-24 19:45:46.477376] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:55.592 [2024-07-24 19:45:46.477448] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:55.592 [2024-07-24 19:45:46.477464] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:55.592 [2024-07-24 19:45:46.485379] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:55.592 [2024-07-24 19:45:46.485413] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:55.592 [2024-07-24 19:45:46.493396] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:55.592 [2024-07-24 19:45:46.493422] bdev.c:8190:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:55.592 [2024-07-24 19:45:46.570882] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:55.592 [2024-07-24 19:45:46.570938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:55.592 [2024-07-24 19:45:46.570956] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1749910 00:09:55.592 [2024-07-24 19:45:46.570969] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:55.592 [2024-07-24 19:45:46.572461] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:55.592 [2024-07-24 19:45:46.572492] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:55.592 Running I/O for 5 seconds... 00:10:00.866 00:10:00.866 Latency(us) 00:10:00.866 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:00.866 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x1000 00:10:00.866 Malloc0 : 5.19 1183.79 4.62 0.00 0.00 107941.43 541.38 341015.15 00:10:00.866 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x1000 length 0x1000 00:10:00.866 Malloc0 : 5.29 991.71 3.87 0.00 0.00 123649.24 651.80 194214.51 00:10:00.866 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x800 00:10:00.866 Malloc1p0 : 5.19 616.30 2.41 0.00 0.00 206963.76 2507.46 173242.99 00:10:00.866 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x800 length 0x800 00:10:00.866 Malloc1p0 : 5.26 511.36 2.00 0.00 0.00 249909.79 3105.84 225215.89 00:10:00.866 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 
length 0x800 00:10:00.866 Malloc1p1 : 5.19 616.05 2.41 0.00 0.00 206612.78 2493.22 171419.38 00:10:00.866 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x800 length 0x800 00:10:00.866 Malloc1p1 : 5.26 511.11 2.00 0.00 0.00 249359.89 3205.57 225215.89 00:10:00.866 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p0 : 5.20 615.80 2.41 0.00 0.00 206266.42 2721.17 171419.38 00:10:00.866 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p0 : 5.26 510.87 2.00 0.00 0.00 248882.50 3348.03 225215.89 00:10:00.866 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p1 : 5.20 615.52 2.40 0.00 0.00 205917.15 3433.52 168683.97 00:10:00.866 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p1 : 5.26 510.63 1.99 0.00 0.00 248372.47 3932.16 224304.08 00:10:00.866 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p2 : 5.20 615.27 2.40 0.00 0.00 205469.63 3575.99 164124.94 00:10:00.866 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p2 : 5.27 510.39 1.99 0.00 0.00 247736.81 4074.63 220656.86 00:10:00.866 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p3 : 5.20 615.02 2.40 0.00 0.00 204993.14 2877.89 162301.33 00:10:00.866 Job: Malloc2p3 (Core Mask 0x2, workload: verify, 
depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p3 : 5.27 510.15 1.99 0.00 0.00 247076.27 3490.50 217921.45 00:10:00.866 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p4 : 5.21 614.76 2.40 0.00 0.00 204627.65 2535.96 163213.13 00:10:00.866 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p4 : 5.27 509.91 1.99 0.00 0.00 246538.83 3191.32 217921.45 00:10:00.866 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p5 : 5.21 614.49 2.40 0.00 0.00 204285.11 2493.22 165036.74 00:10:00.866 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p5 : 5.27 509.67 1.99 0.00 0.00 246056.08 3433.52 217009.64 00:10:00.866 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p6 : 5.21 614.22 2.40 0.00 0.00 203929.26 2820.90 165036.74 00:10:00.866 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p6 : 5.28 509.42 1.99 0.00 0.00 245536.23 3989.15 214274.23 00:10:00.866 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x200 00:10:00.866 Malloc2p7 : 5.21 613.94 2.40 0.00 0.00 203564.36 3533.25 163213.13 00:10:00.866 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x200 length 0x200 00:10:00.866 Malloc2p7 : 5.28 509.18 1.99 0.00 0.00 244891.63 2920.63 
212450.62 00:10:00.866 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x1000 00:10:00.866 TestPT : 5.23 612.29 2.39 0.00 0.00 203485.52 10086.85 161389.52 00:10:00.866 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x1000 length 0x1000 00:10:00.866 TestPT : 5.24 488.84 1.91 0.00 0.00 254282.17 72944.42 213362.42 00:10:00.866 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x2000 00:10:00.866 raid0 : 5.22 613.30 2.40 0.00 0.00 202715.57 2336.50 154095.08 00:10:00.866 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x2000 length 0x2000 00:10:00.866 raid0 : 5.28 508.93 1.99 0.00 0.00 243628.03 3519.00 196949.93 00:10:00.866 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x0 length 0x2000 00:10:00.866 concat0 : 5.22 613.03 2.39 0.00 0.00 202385.68 2621.44 156830.50 00:10:00.866 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.866 Verification LBA range: start 0x2000 length 0x2000 00:10:00.867 concat0 : 5.28 508.69 1.99 0.00 0.00 243116.90 3091.59 192390.90 00:10:00.867 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.867 Verification LBA range: start 0x0 length 0x1000 00:10:00.867 raid1 : 5.22 612.74 2.39 0.00 0.00 201978.06 3162.82 168683.97 00:10:00.867 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.867 Verification LBA range: start 0x1000 length 0x1000 00:10:00.867 raid1 : 5.29 508.44 1.99 0.00 0.00 242599.20 4131.62 192390.90 00:10:00.867 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:00.867 Verification LBA range: start 0x0 length 0x4e2 00:10:00.867 AIO0 : 5.22 612.55 2.39 0.00 
0.00 201613.96 1317.84 175066.60 00:10:00.867 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:00.867 Verification LBA range: start 0x4e2 length 0x4e2 00:10:00.867 AIO0 : 5.29 508.25 1.99 0.00 0.00 241948.42 1659.77 196038.12 00:10:00.867 =================================================================================================================== 00:10:00.867 Total : 19016.60 74.28 0.00 0.00 211202.27 541.38 341015.15 00:10:01.167 00:10:01.167 real 0m6.511s 00:10:01.167 user 0m12.059s 00:10:01.167 sys 0m0.410s 00:10:01.167 19:45:52 blockdev_general.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:01.167 19:45:52 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:10:01.167 ************************************ 00:10:01.167 END TEST bdev_verify 00:10:01.167 ************************************ 00:10:01.167 19:45:52 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:01.167 19:45:52 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:10:01.167 19:45:52 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:01.167 19:45:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:01.167 ************************************ 00:10:01.167 START TEST bdev_verify_big_io 00:10:01.167 ************************************ 00:10:01.167 19:45:52 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:01.167 [2024-07-24 19:45:52.697152] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:10:01.167 [2024-07-24 19:45:52.697218] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1361484 ] 00:10:01.427 [2024-07-24 19:45:52.810556] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:01.427 [2024-07-24 19:45:52.916183] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:01.427 [2024-07-24 19:45:52.916187] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:01.686 [2024-07-24 19:45:53.074139] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:01.686 [2024-07-24 19:45:53.074202] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:01.686 [2024-07-24 19:45:53.074217] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:01.686 [2024-07-24 19:45:53.082148] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:01.686 [2024-07-24 19:45:53.082177] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:01.686 [2024-07-24 19:45:53.090163] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:01.686 [2024-07-24 19:45:53.090188] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:01.686 [2024-07-24 19:45:53.167849] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:01.686 [2024-07-24 19:45:53.167906] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:01.686 [2024-07-24 19:45:53.167925] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c03910 00:10:01.686 [2024-07-24 19:45:53.167938] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:10:01.686 [2024-07-24 19:45:53.169435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:01.686 [2024-07-24 19:45:53.169466] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:01.946 [2024-07-24 19:45:53.339717] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.340928] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.342702] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.343888] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.345527] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.346417] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.347800] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.349233] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.350162] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.351602] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.352524] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.354002] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.354935] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.356274] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.357105] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.358438] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:10:01.946 [2024-07-24 19:45:53.380274] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:01.946 [2024-07-24 19:45:53.382105] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:01.946 Running I/O for 5 seconds... 
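The warnings above show bdevperf clamping the requested verify queue depth (-q 128) to each bdev's per-job submission limit (32 for the Malloc2p* bdevs, 78 for AIO0). A minimal sketch of that clamp, using hypothetical variable names (the actual logic lives in bdevperf_construct_job):

```shell
# Hypothetical illustration of the queue-depth clamp reported in the warnings above.
requested_qd=128
max_outstanding=32   # per-bdev simultaneous request limit (Malloc2p* in this run)
effective_qd=$(( requested_qd < max_outstanding ? requested_qd : max_outstanding ))
echo "$effective_qd"   # the depth actually used for the job
```

This is why the Malloc2p* rows in the results table report "depth: 32" even though the command line passed -q 128.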
00:10:10.067 00:10:10.067 Latency(us) 00:10:10.067 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:10.067 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x100 00:10:10.067 Malloc0 : 6.19 144.83 9.05 0.00 0.00 867112.46 869.06 2348810.24 00:10:10.067 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x100 length 0x100 00:10:10.067 Malloc0 : 6.25 122.93 7.68 0.00 0.00 1017945.41 1111.26 2771887.86 00:10:10.067 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x80 00:10:10.067 Malloc1p0 : 6.38 85.92 5.37 0.00 0.00 1382256.22 2478.97 2801065.63 00:10:10.067 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x80 length 0x80 00:10:10.067 Malloc1p0 : 7.10 31.54 1.97 0.00 0.00 3644357.26 1852.10 5952264.46 00:10:10.067 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x80 00:10:10.067 Malloc1p1 : 6.70 35.84 2.24 0.00 0.00 3167315.03 1524.42 5602131.26 00:10:10.067 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x80 length 0x80 00:10:10.067 Malloc1p1 : 7.10 31.53 1.97 0.00 0.00 3503824.90 1909.09 5718842.32 00:10:10.067 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x20 00:10:10.067 Malloc2p0 : 6.27 22.97 1.44 0.00 0.00 1241278.24 658.92 2042443.69 00:10:10.067 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x20 length 0x20 00:10:10.067 Malloc2p0 : 6.45 19.84 1.24 0.00 0.00 1390265.07 954.55 2319632.47 00:10:10.067 Job: Malloc2p1 (Core Mask 
0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x20 00:10:10.067 Malloc2p1 : 6.27 22.97 1.44 0.00 0.00 1229897.47 641.11 2013265.92 00:10:10.067 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x20 length 0x20 00:10:10.067 Malloc2p1 : 6.45 19.83 1.24 0.00 0.00 1374757.16 787.14 2290454.71 00:10:10.067 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x20 00:10:10.067 Malloc2p2 : 6.27 22.96 1.44 0.00 0.00 1218512.32 633.99 1998677.04 00:10:10.067 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x20 length 0x20 00:10:10.067 Malloc2p2 : 6.46 19.83 1.24 0.00 0.00 1360138.87 797.83 2261276.94 00:10:10.067 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x20 00:10:10.067 Malloc2p3 : 6.27 22.96 1.43 0.00 0.00 1207757.88 819.20 1969499.27 00:10:10.067 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x20 length 0x20 00:10:10.067 Malloc2p3 : 6.46 19.82 1.24 0.00 0.00 1344604.34 769.34 2232099.17 00:10:10.067 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x20 00:10:10.067 Malloc2p4 : 6.27 22.95 1.43 0.00 0.00 1196735.36 648.24 1940321.50 00:10:10.067 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x20 length 0x20 00:10:10.067 Malloc2p4 : 6.46 19.82 1.24 0.00 0.00 1329721.61 783.58 2188332.52 00:10:10.067 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.067 Verification LBA range: start 0x0 length 0x20 00:10:10.068 Malloc2p5 : 6.28 22.95 1.43 0.00 0.00 1185320.85 
644.67 1911143.74 00:10:10.068 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x20 length 0x20 00:10:10.068 Malloc2p5 : 6.58 21.87 1.37 0.00 0.00 1204614.44 780.02 2159154.75 00:10:10.068 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x0 length 0x20 00:10:10.068 Malloc2p6 : 6.28 22.94 1.43 0.00 0.00 1174604.81 666.05 1881965.97 00:10:10.068 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x20 length 0x20 00:10:10.068 Malloc2p6 : 6.59 21.87 1.37 0.00 0.00 1191213.77 790.71 2129976.99 00:10:10.068 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x0 length 0x20 00:10:10.068 Malloc2p7 : 6.28 22.94 1.43 0.00 0.00 1163622.24 630.43 1860082.64 00:10:10.068 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x20 length 0x20 00:10:10.068 Malloc2p7 : 6.59 21.86 1.37 0.00 0.00 1177364.20 783.58 2100799.22 00:10:10.068 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x0 length 0x100 00:10:10.068 TestPT : 6.70 33.74 2.11 0.00 0.00 3023318.15 110784.33 4026531.84 00:10:10.068 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x100 length 0x100 00:10:10.068 TestPT : 7.13 31.40 1.96 0.00 0.00 3140004.79 124005.51 3938998.54 00:10:10.068 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x0 length 0x200 00:10:10.068 raid0 : 6.89 39.50 2.47 0.00 0.00 2487627.91 1624.15 4814331.55 00:10:10.068 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x200 length 0x200 
00:10:10.068 raid0 : 6.99 38.93 2.43 0.00 0.00 2482593.90 2051.56 4755976.01 00:10:10.068 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x0 length 0x200 00:10:10.068 concat0 : 6.70 45.37 2.84 0.00 0.00 2131132.34 1609.91 4639264.95 00:10:10.068 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x200 length 0x200 00:10:10.068 concat0 : 6.99 51.51 3.22 0.00 0.00 1806924.42 2023.07 4551731.65 00:10:10.068 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x0 length 0x100 00:10:10.068 raid1 : 6.89 60.39 3.77 0.00 0.00 1575516.52 2051.56 4435020.58 00:10:10.068 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x100 length 0x100 00:10:10.068 raid1 : 7.14 71.75 4.48 0.00 0.00 1251120.90 2635.69 4347487.28 00:10:10.068 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x0 length 0x4e 00:10:10.068 AIO0 : 6.89 57.47 3.59 0.00 0.00 982075.06 480.83 2946954.46 00:10:10.068 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:10:10.068 Verification LBA range: start 0x4e length 0x4e 00:10:10.068 AIO0 : 7.27 97.10 6.07 0.00 0.00 548479.43 466.59 3209554.37 00:10:10.068 =================================================================================================================== 00:10:10.068 Total : 1328.12 83.01 0.00 0.00 1538399.00 466.59 5952264.46 00:10:10.068 00:10:10.068 real 0m8.534s 00:10:10.068 user 0m16.088s 00:10:10.068 sys 0m0.438s 00:10:10.068 19:46:01 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:10.068 19:46:01 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:10:10.068 ************************************ 00:10:10.068 END TEST 
bdev_verify_big_io 00:10:10.068 ************************************ 00:10:10.068 19:46:01 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:10.068 19:46:01 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:10:10.068 19:46:01 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:10.068 19:46:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:10.068 ************************************ 00:10:10.068 START TEST bdev_write_zeroes 00:10:10.068 ************************************ 00:10:10.068 19:46:01 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:10.068 [2024-07-24 19:46:01.325197] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:10:10.068 [2024-07-24 19:46:01.325265] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1362569 ] 00:10:10.068 [2024-07-24 19:46:01.458051] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:10.068 [2024-07-24 19:46:01.564355] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:10.327 [2024-07-24 19:46:01.720354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:10.327 [2024-07-24 19:46:01.720423] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:10.327 [2024-07-24 19:46:01.720439] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:10.327 [2024-07-24 19:46:01.728355] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:10.327 [2024-07-24 19:46:01.728382] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:10.327 [2024-07-24 19:46:01.736366] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:10.327 [2024-07-24 19:46:01.736403] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:10.327 [2024-07-24 19:46:01.813756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:10.327 [2024-07-24 19:46:01.813816] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:10.327 [2024-07-24 19:46:01.813834] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261f640 00:10:10.327 [2024-07-24 19:46:01.813847] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:10.327 [2024-07-24 19:46:01.815277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:10:10.327 [2024-07-24 19:46:01.815308] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:10.585 Running I/O for 1 seconds... 00:10:11.970 00:10:11.970 Latency(us) 00:10:11.970 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:11.970 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc0 : 1.04 4944.01 19.31 0.00 0.00 25871.97 666.05 42854.85 00:10:11.970 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc1p0 : 1.04 4936.82 19.28 0.00 0.00 25862.63 904.68 41943.04 00:10:11.970 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc1p1 : 1.04 4929.68 19.26 0.00 0.00 25843.97 894.00 41031.23 00:10:11.970 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p0 : 1.04 4922.53 19.23 0.00 0.00 25827.15 890.43 40347.38 00:10:11.970 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p1 : 1.04 4915.49 19.20 0.00 0.00 25805.55 890.43 39435.58 00:10:11.970 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p2 : 1.04 4908.44 19.17 0.00 0.00 25783.28 886.87 38523.77 00:10:11.970 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p3 : 1.04 4901.29 19.15 0.00 0.00 25761.14 897.56 37611.97 00:10:11.970 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p4 : 1.05 4894.24 19.12 0.00 0.00 25738.46 894.00 36700.16 00:10:11.970 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p5 : 1.05 4887.29 19.09 0.00 0.00 25719.08 897.56 35788.35 00:10:11.970 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p6 : 1.05 4880.25 
19.06 0.00 0.00 25699.88 897.56 34876.55 00:10:11.970 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 Malloc2p7 : 1.05 4873.28 19.04 0.00 0.00 25687.20 894.00 34192.70 00:10:11.970 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 TestPT : 1.06 4935.95 19.28 0.00 0.00 25311.28 926.05 33280.89 00:10:11.970 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 raid0 : 1.06 4927.96 19.25 0.00 0.00 25279.10 1624.15 31457.28 00:10:11.970 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 concat0 : 1.07 4920.10 19.22 0.00 0.00 25227.09 1624.15 29861.62 00:10:11.970 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 raid1 : 1.07 4910.33 19.18 0.00 0.00 25166.17 2578.70 27354.16 00:10:11.970 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:11.970 AIO0 : 1.07 4904.37 19.16 0.00 0.00 25072.32 1018.66 27354.16 00:10:11.970 =================================================================================================================== 00:10:11.970 Total : 78592.03 307.00 0.00 0.00 25600.48 666.05 42854.85 00:10:11.970 00:10:11.970 real 0m2.222s 00:10:11.970 user 0m1.808s 00:10:11.970 sys 0m0.364s 00:10:11.970 19:46:03 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:11.970 19:46:03 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:10:11.970 ************************************ 00:10:11.970 END TEST bdev_write_zeroes 00:10:11.970 ************************************ 00:10:11.970 19:46:03 blockdev_general -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
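The MiB/s column in the latency tables above is derivable from the IOPS column and the job's IO size; for example, the write_zeroes total (78592.03 IOPS at 4096-byte IOs) works out to the reported 307.00 MiB/s:

```shell
# Sanity-check the MiB/s column: IOPS * IO size (bytes) / 1 MiB.
awk 'BEGIN { printf "%.2f\n", 78592.03 * 4096 / 1048576 }'
```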
00:10:11.970 19:46:03 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:10:11.970 19:46:03 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:11.970 19:46:03 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:12.230 ************************************ 00:10:12.230 START TEST bdev_json_nonenclosed 00:10:12.230 ************************************ 00:10:12.230 19:46:03 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:12.230 [2024-07-24 19:46:03.643277] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:10:12.230 [2024-07-24 19:46:03.643339] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1362926 ] 00:10:12.230 [2024-07-24 19:46:03.771846] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.489 [2024-07-24 19:46:03.872801] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:12.489 [2024-07-24 19:46:03.872872] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:10:12.489 [2024-07-24 19:46:03.872891] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:12.489 [2024-07-24 19:46:03.872905] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:12.489 00:10:12.489 real 0m0.397s 00:10:12.489 user 0m0.238s 00:10:12.489 sys 0m0.157s 00:10:12.489 19:46:03 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:12.489 19:46:03 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:10:12.489 ************************************ 00:10:12.489 END TEST bdev_json_nonenclosed 00:10:12.489 ************************************ 00:10:12.489 19:46:04 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:12.489 19:46:04 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:10:12.489 19:46:04 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:12.489 19:46:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:12.489 ************************************ 00:10:12.489 START TEST bdev_json_nonarray 00:10:12.489 ************************************ 00:10:12.489 19:46:04 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:12.748 [2024-07-24 19:46:04.122053] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:10:12.748 [2024-07-24 19:46:04.122139] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1362989 ] 00:10:12.748 [2024-07-24 19:46:04.267152] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:13.007 [2024-07-24 19:46:04.373871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:13.007 [2024-07-24 19:46:04.373949] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:10:13.007 [2024-07-24 19:46:04.373966] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:13.007 [2024-07-24 19:46:04.373978] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:13.007 00:10:13.007 real 0m0.419s 00:10:13.007 user 0m0.247s 00:10:13.007 sys 0m0.168s 00:10:13.007 19:46:04 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:13.007 19:46:04 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:10:13.007 ************************************ 00:10:13.007 END TEST bdev_json_nonarray 00:10:13.007 ************************************ 00:10:13.007 19:46:04 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:10:13.007 19:46:04 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:10:13.007 19:46:04 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:13.007 19:46:04 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:13.007 19:46:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:13.007 ************************************ 00:10:13.007 START TEST bdev_qos 00:10:13.007 ************************************ 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1125 -- # qos_test_suite '' 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=1363135 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 1363135' 00:10:13.007 Process qos testing pid: 1363135 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 1363135 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- common/autotest_common.sh@831 -- # '[' -z 1363135 ']' 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:13.007 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:13.007 19:46:04 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:13.266 [2024-07-24 19:46:04.632380] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:10:13.266 [2024-07-24 19:46:04.632459] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1363135 ] 00:10:13.266 [2024-07-24 19:46:04.767895] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:13.525 [2024-07-24 19:46:04.883985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@864 -- # return 0 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.092 Malloc_0 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_0 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:14.092 19:46:05 
blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.092 [ 00:10:14.092 { 00:10:14.092 "name": "Malloc_0", 00:10:14.092 "aliases": [ 00:10:14.092 "15eaeeec-2eec-46ab-9455-1c39a5de1d79" 00:10:14.092 ], 00:10:14.092 "product_name": "Malloc disk", 00:10:14.092 "block_size": 512, 00:10:14.092 "num_blocks": 262144, 00:10:14.092 "uuid": "15eaeeec-2eec-46ab-9455-1c39a5de1d79", 00:10:14.092 "assigned_rate_limits": { 00:10:14.092 "rw_ios_per_sec": 0, 00:10:14.092 "rw_mbytes_per_sec": 0, 00:10:14.092 "r_mbytes_per_sec": 0, 00:10:14.092 "w_mbytes_per_sec": 0 00:10:14.092 }, 00:10:14.092 "claimed": false, 00:10:14.092 "zoned": false, 00:10:14.092 "supported_io_types": { 00:10:14.092 "read": true, 00:10:14.092 "write": true, 00:10:14.092 "unmap": true, 00:10:14.092 "flush": true, 00:10:14.092 "reset": true, 00:10:14.092 "nvme_admin": false, 00:10:14.092 "nvme_io": false, 00:10:14.092 "nvme_io_md": false, 00:10:14.092 "write_zeroes": true, 00:10:14.092 "zcopy": true, 00:10:14.092 "get_zone_info": false, 00:10:14.092 "zone_management": false, 00:10:14.092 "zone_append": false, 00:10:14.092 "compare": false, 00:10:14.092 "compare_and_write": false, 00:10:14.092 "abort": true, 00:10:14.092 "seek_hole": false, 00:10:14.092 "seek_data": false, 00:10:14.092 "copy": true, 00:10:14.092 "nvme_iov_md": false 00:10:14.092 }, 00:10:14.092 "memory_domains": [ 00:10:14.092 { 00:10:14.092 "dma_device_id": "system", 00:10:14.092 "dma_device_type": 1 00:10:14.092 }, 00:10:14.092 { 00:10:14.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.092 
"dma_device_type": 2 00:10:14.092 } 00:10:14.092 ], 00:10:14.092 "driver_specific": {} 00:10:14.092 } 00:10:14.092 ] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.092 Null_1 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Null_1 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:14.092 
19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.092 [ 00:10:14.092 { 00:10:14.092 "name": "Null_1", 00:10:14.092 "aliases": [ 00:10:14.092 "0bd812ed-3690-4b7e-927f-09655d5ab562" 00:10:14.092 ], 00:10:14.092 "product_name": "Null disk", 00:10:14.092 "block_size": 512, 00:10:14.092 "num_blocks": 262144, 00:10:14.092 "uuid": "0bd812ed-3690-4b7e-927f-09655d5ab562", 00:10:14.092 "assigned_rate_limits": { 00:10:14.092 "rw_ios_per_sec": 0, 00:10:14.092 "rw_mbytes_per_sec": 0, 00:10:14.092 "r_mbytes_per_sec": 0, 00:10:14.092 "w_mbytes_per_sec": 0 00:10:14.092 }, 00:10:14.092 "claimed": false, 00:10:14.092 "zoned": false, 00:10:14.092 "supported_io_types": { 00:10:14.092 "read": true, 00:10:14.092 "write": true, 00:10:14.092 "unmap": false, 00:10:14.092 "flush": false, 00:10:14.092 "reset": true, 00:10:14.092 "nvme_admin": false, 00:10:14.092 "nvme_io": false, 00:10:14.092 "nvme_io_md": false, 00:10:14.092 "write_zeroes": true, 00:10:14.092 "zcopy": false, 00:10:14.092 "get_zone_info": false, 00:10:14.092 "zone_management": false, 00:10:14.092 "zone_append": false, 00:10:14.092 "compare": false, 00:10:14.092 "compare_and_write": false, 00:10:14.092 "abort": true, 00:10:14.092 "seek_hole": false, 00:10:14.092 "seek_data": false, 00:10:14.092 "copy": false, 00:10:14.092 "nvme_iov_md": false 00:10:14.092 }, 00:10:14.092 "driver_specific": {} 00:10:14.092 } 00:10:14.092 ] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # qos_function_test 00:10:14.092 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 
00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:14.351 19:46:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:10:14.351 Running I/O for 60 seconds... 
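The measurement pipeline traced above (`iostat.py -d -i 1 -t 5 | grep Malloc_0 | tail -1`, then `awk '{print $2}'`) pulls the tps column for the device, and the test then accepts any result within a window around the configured limit. A minimal standalone sketch of that parse-and-check, using an iostat line and the 12000 IOPS limit captured later in this same run (the `${var%.*}` truncation and the exact ±10% integer window are assumptions inferred from the `echo 11996` / `lower_limit=10800` / `upper_limit=13200` values visible in the trace):

```shell
#!/bin/sh
# Sketch of the parse-and-tolerance check that run_qos_test appears to perform.
# The sample iostat line and the 12000 IOPS limit are copied from this run's log.
iostat_result='Malloc_0 11996.67 47986.66 0.00 0.00 49056.00 0.00 0.00 '
qos_limit=12000

# Second whitespace-separated column is the tps (IOPS) figure;
# drop the fractional part to get an integer for the comparison.
iops=$(echo "$iostat_result" | awk '{print $2}')
iops=${iops%.*}

# Accept anything within +/-10% of the configured limit (integer math,
# matching the 10800/13200 bounds seen in the trace for a 12000 limit).
lower_limit=$(( qos_limit * 9 / 10 ))
upper_limit=$(( qos_limit * 11 / 10 ))

if [ "$iops" -lt "$lower_limit" ] || [ "$iops" -gt "$upper_limit" ]; then
    echo "FAIL: $iops outside [$lower_limit, $upper_limit]"
else
    echo "PASS: $iops within [$lower_limit, $upper_limit]"
fi
```

The same window arithmetic is consistent with the bandwidth test later in the run, where a 6144 KiB/s limit yields the 5529/6758 bounds shown in the trace.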
00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 48724.53 194898.12 0.00 0.00 196608.00 0.00 0.00 ' 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=48724.53 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 48724 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=48724 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=12000 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 12000 -gt 1000 ']' 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 12000 Malloc_0 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 12000 IOPS Malloc_0 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:19.621 19:46:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:19.621 ************************************ 00:10:19.621 START TEST bdev_qos_iops 00:10:19.621 ************************************ 00:10:19.621 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # run_qos_test 12000 IOPS Malloc_0 00:10:19.621 19:46:10 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=12000 00:10:19.621 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:19.622 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:10:19.622 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:10:19.622 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:19.622 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:19.622 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:19.622 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:19.622 19:46:10 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 11996.67 47986.66 0.00 0.00 49056.00 0.00 0.00 ' 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=11996.67 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 11996 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=11996 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=10800 00:10:24.889 19:46:16 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=13200 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 11996 -lt 10800 ']' 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 11996 -gt 13200 ']' 00:10:24.889 00:10:24.889 real 0m5.280s 00:10:24.889 user 0m0.128s 00:10:24.889 sys 0m0.039s 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:24.889 19:46:16 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:24.889 ************************************ 00:10:24.889 END TEST bdev_qos_iops 00:10:24.889 ************************************ 00:10:24.889 19:46:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:10:24.889 19:46:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:10:24.889 19:46:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:10:24.889 19:46:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:24.889 19:46:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:24.889 19:46:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:10:24.889 19:46:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 15545.13 62180.53 0.00 0.00 63488.00 0.00 0.00 ' 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:30.159 19:46:21 
blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=63488.00 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 63488 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=63488 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=6 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 6 -lt 2 ']' 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:30.159 19:46:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:30.159 ************************************ 00:10:30.159 START TEST bdev_qos_bw 00:10:30.159 ************************************ 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # run_qos_test 6 BANDWIDTH Null_1 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=6 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local 
limit_type=BANDWIDTH 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:10:30.159 19:46:21 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 1535.38 6141.52 0.00 0.00 6320.00 0.00 0.00 ' 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=6320.00 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 6320 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=6320 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=6144 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=5529 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=6758 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 6320 -lt 5529 ']' 00:10:35.428 19:46:26 
blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 6320 -gt 6758 ']' 00:10:35.428 00:10:35.428 real 0m5.318s 00:10:35.428 user 0m0.125s 00:10:35.428 sys 0m0.041s 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:35.428 19:46:26 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:35.428 ************************************ 00:10:35.428 END TEST bdev_qos_bw 00:10:35.428 ************************************ 00:10:35.428 19:46:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:35.428 19:46:26 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:35.428 19:46:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:35.687 19:46:27 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:35.687 19:46:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:35.687 19:46:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:35.687 19:46:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:35.687 19:46:27 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:35.687 ************************************ 00:10:35.687 START TEST bdev_qos_ro_bw 00:10:35.687 ************************************ 00:10:35.687 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:35.687 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:10:35.687 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:35.688 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result 
BANDWIDTH Malloc_0 00:10:35.688 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:10:35.688 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:35.688 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:35.688 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:35.688 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:35.688 19:46:27 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 511.22 2044.87 0.00 0.00 2052.00 0.00 0.00 ' 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2052.00 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2052 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2052 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # qos_limit=2048 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 
-- # upper_limit=2252 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2052 -lt 1843 ']' 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2052 -gt 2252 ']' 00:10:41.015 00:10:41.015 real 0m5.185s 00:10:41.015 user 0m0.111s 00:10:41.015 sys 0m0.054s 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:41.015 19:46:32 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:41.015 ************************************ 00:10:41.015 END TEST bdev_qos_ro_bw 00:10:41.015 ************************************ 00:10:41.015 19:46:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:41.015 19:46:32 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.016 19:46:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:41.274 19:46:32 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.274 19:46:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:10:41.274 19:46:32 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.274 19:46:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:41.533 00:10:41.533 Latency(us) 00:10:41.533 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:41.533 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:41.533 Malloc_0 : 26.96 16454.56 64.28 0.00 0.00 15413.02 2550.21 503316.48 00:10:41.533 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:41.533 Null_1 : 27.16 15860.65 61.96 0.00 0.00 16081.89 1011.53 198773.54 00:10:41.533 
=================================================================================================================== 00:10:41.533 Total : 32315.21 126.23 0.00 0.00 15742.54 1011.53 503316.48 00:10:41.533 0 00:10:41.533 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.533 19:46:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 1363135 00:10:41.533 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # '[' -z 1363135 ']' 00:10:41.533 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # kill -0 1363135 00:10:41.533 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # uname 00:10:41.533 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:41.534 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1363135 00:10:41.534 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:10:41.534 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:10:41.534 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1363135' 00:10:41.534 killing process with pid 1363135 00:10:41.534 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@969 -- # kill 1363135 00:10:41.534 Received shutdown signal, test time was about 27.222544 seconds 00:10:41.534 00:10:41.534 Latency(us) 00:10:41.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:41.534 =================================================================================================================== 00:10:41.534 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:41.534 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@974 -- # wait 1363135 00:10:41.793 19:46:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM 
EXIT 00:10:41.793 00:10:41.793 real 0m28.815s 00:10:41.793 user 0m29.578s 00:10:41.793 sys 0m0.883s 00:10:41.793 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:41.793 19:46:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:41.793 ************************************ 00:10:41.793 END TEST bdev_qos 00:10:41.793 ************************************ 00:10:42.051 19:46:33 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:42.051 19:46:33 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:42.051 19:46:33 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:42.051 19:46:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:42.051 ************************************ 00:10:42.051 START TEST bdev_qd_sampling 00:10:42.051 ************************************ 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # qd_sampling_test_suite '' 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=1366926 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 1366926' 00:10:42.051 Process bdev QD sampling period testing pid: 1366926 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 1366926 00:10:42.051 19:46:33 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@831 -- # '[' -z 1366926 ']' 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:42.051 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:42.051 19:46:33 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:42.051 [2024-07-24 19:46:33.528287] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:10:42.051 [2024-07-24 19:46:33.528348] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1366926 ] 00:10:42.051 [2024-07-24 19:46:33.640952] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:42.309 [2024-07-24 19:46:33.746361] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:42.309 [2024-07-24 19:46:33.746366] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:42.875 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:42.875 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@864 -- # return 0 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 
-- # xtrace_disable 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:42.876 Malloc_QD 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_QD 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # local i 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.876 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:43.135 [ 00:10:43.135 { 00:10:43.135 "name": "Malloc_QD", 00:10:43.135 "aliases": [ 00:10:43.135 "09d8a246-81cc-4045-979b-6187cf635375" 00:10:43.135 ], 00:10:43.135 "product_name": "Malloc disk", 00:10:43.135 "block_size": 512, 00:10:43.135 "num_blocks": 262144, 00:10:43.135 "uuid": 
"09d8a246-81cc-4045-979b-6187cf635375", 00:10:43.135 "assigned_rate_limits": { 00:10:43.135 "rw_ios_per_sec": 0, 00:10:43.135 "rw_mbytes_per_sec": 0, 00:10:43.135 "r_mbytes_per_sec": 0, 00:10:43.135 "w_mbytes_per_sec": 0 00:10:43.135 }, 00:10:43.135 "claimed": false, 00:10:43.135 "zoned": false, 00:10:43.135 "supported_io_types": { 00:10:43.135 "read": true, 00:10:43.135 "write": true, 00:10:43.135 "unmap": true, 00:10:43.135 "flush": true, 00:10:43.135 "reset": true, 00:10:43.135 "nvme_admin": false, 00:10:43.135 "nvme_io": false, 00:10:43.135 "nvme_io_md": false, 00:10:43.135 "write_zeroes": true, 00:10:43.135 "zcopy": true, 00:10:43.135 "get_zone_info": false, 00:10:43.135 "zone_management": false, 00:10:43.135 "zone_append": false, 00:10:43.135 "compare": false, 00:10:43.135 "compare_and_write": false, 00:10:43.135 "abort": true, 00:10:43.135 "seek_hole": false, 00:10:43.135 "seek_data": false, 00:10:43.135 "copy": true, 00:10:43.135 "nvme_iov_md": false 00:10:43.135 }, 00:10:43.135 "memory_domains": [ 00:10:43.135 { 00:10:43.135 "dma_device_id": "system", 00:10:43.135 "dma_device_type": 1 00:10:43.135 }, 00:10:43.135 { 00:10:43.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.135 "dma_device_type": 2 00:10:43.135 } 00:10:43.135 ], 00:10:43.135 "driver_specific": {} 00:10:43.135 } 00:10:43.135 ] 00:10:43.135 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:43.135 19:46:34 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@907 -- # return 0 00:10:43.135 19:46:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:10:43.135 19:46:34 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:43.135 Running I/O for 5 seconds... 
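The sampling check that follows reads `queue_depth_polling_period` out of the `bdev_get_iostat` JSON with `jq`. That same JSON also reports `io_time` and `weighted_io_time` counters, whose ratio matches the reported `queue_depth` in this run; treating their quotient as the average queue depth is an assumed interpretation of those fields, not documented behavior, sketched here only as a cross-check:

```shell
#!/bin/sh
# Cross-check of the QD sampling counters reported by bdev_get_iostat.
# Values are copied from this run's JSON output; computing
# avg QD = weighted_io_time / io_time is an assumption about the fields.
io_time=30
weighted_io_time=15360
avg_queue_depth=$(( weighted_io_time / io_time ))
echo "average queue depth: $avg_queue_depth"   # 512 for this run's counters
```

For this run the quotient (15360 / 30 = 512) agrees exactly with the `"queue_depth": 512` field in the same JSON, which is what makes the interpretation plausible.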
00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:10:45.039 "tick_rate": 2300000000, 00:10:45.039 "ticks": 7195764871071998, 00:10:45.039 "bdevs": [ 00:10:45.039 { 00:10:45.039 "name": "Malloc_QD", 00:10:45.039 "bytes_read": 677425664, 00:10:45.039 "num_read_ops": 165380, 00:10:45.039 "bytes_written": 0, 00:10:45.039 "num_write_ops": 0, 00:10:45.039 "bytes_unmapped": 0, 00:10:45.039 "num_unmap_ops": 0, 00:10:45.039 "bytes_copied": 0, 00:10:45.039 "num_copy_ops": 0, 00:10:45.039 "read_latency_ticks": 2236140587198, 00:10:45.039 "max_read_latency_ticks": 17848004, 00:10:45.039 "min_read_latency_ticks": 300666, 
00:10:45.039 "write_latency_ticks": 0, 00:10:45.039 "max_write_latency_ticks": 0, 00:10:45.039 "min_write_latency_ticks": 0, 00:10:45.039 "unmap_latency_ticks": 0, 00:10:45.039 "max_unmap_latency_ticks": 0, 00:10:45.039 "min_unmap_latency_ticks": 0, 00:10:45.039 "copy_latency_ticks": 0, 00:10:45.039 "max_copy_latency_ticks": 0, 00:10:45.039 "min_copy_latency_ticks": 0, 00:10:45.039 "io_error": {}, 00:10:45.039 "queue_depth_polling_period": 10, 00:10:45.039 "queue_depth": 512, 00:10:45.039 "io_time": 30, 00:10:45.039 "weighted_io_time": 15360 00:10:45.039 } 00:10:45.039 ] 00:10:45.039 }' 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:45.039 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:45.039 00:10:45.039 Latency(us) 00:10:45.039 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:45.040 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:45.040 Malloc_QD : 1.98 47128.99 184.10 0.00 0.00 5418.17 1410.45 5869.75 00:10:45.040 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:45.040 Malloc_QD : 1.98 39856.97 155.69 0.00 0.00 6405.79 1218.11 7807.33 00:10:45.040 =================================================================================================================== 00:10:45.040 Total : 86985.95 339.79 
0.00 0.00 5870.95 1218.11 7807.33 00:10:45.040 0 00:10:45.040 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:45.040 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 1366926 00:10:45.040 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # '[' -z 1366926 ']' 00:10:45.040 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # kill -0 1366926 00:10:45.040 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # uname 00:10:45.040 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:45.040 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1366926 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1366926' 00:10:45.299 killing process with pid 1366926 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@969 -- # kill 1366926 00:10:45.299 Received shutdown signal, test time was about 2.059873 seconds 00:10:45.299 00:10:45.299 Latency(us) 00:10:45.299 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:45.299 =================================================================================================================== 00:10:45.299 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@974 -- # wait 1366926 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:10:45.299 00:10:45.299 real 0m3.406s 
00:10:45.299 user 0m6.661s 00:10:45.299 sys 0m0.416s 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:45.299 19:46:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:45.299 ************************************ 00:10:45.299 END TEST bdev_qd_sampling 00:10:45.299 ************************************ 00:10:45.559 19:46:36 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:10:45.559 19:46:36 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:45.559 19:46:36 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:45.559 19:46:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:45.559 ************************************ 00:10:45.559 START TEST bdev_error 00:10:45.559 ************************************ 00:10:45.559 19:46:36 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # error_test_suite '' 00:10:45.559 19:46:36 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 00:10:45.559 19:46:36 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:10:45.559 19:46:36 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:10:45.559 19:46:36 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=1367413 00:10:45.559 19:46:36 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 1367413' 00:10:45.559 Process error testing pid: 1367413 00:10:45.559 19:46:36 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:45.559 19:46:36 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 1367413 00:10:45.559 19:46:36 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 1367413 ']' 00:10:45.559 19:46:36 
blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:45.559 19:46:36 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:45.559 19:46:36 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:45.559 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:45.559 19:46:36 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:45.559 19:46:36 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.559 [2024-07-24 19:46:37.031453] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:10:45.559 [2024-07-24 19:46:37.031526] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1367413 ] 00:10:45.818 [2024-07-24 19:46:37.169641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:45.818 [2024-07-24 19:46:37.302546] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:46.385 19:46:37 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:46.644 19:46:37 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:10:46.644 19:46:37 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:46.644 19:46:37 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 Dev_1 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 
blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 [ 00:10:46.644 { 00:10:46.644 "name": "Dev_1", 00:10:46.644 "aliases": [ 00:10:46.644 "5e5128cd-fabd-4836-82e5-ded555be5abb" 00:10:46.644 ], 00:10:46.644 "product_name": "Malloc disk", 00:10:46.644 "block_size": 512, 00:10:46.644 "num_blocks": 262144, 00:10:46.644 "uuid": "5e5128cd-fabd-4836-82e5-ded555be5abb", 00:10:46.644 "assigned_rate_limits": { 00:10:46.644 "rw_ios_per_sec": 0, 00:10:46.644 "rw_mbytes_per_sec": 0, 00:10:46.644 "r_mbytes_per_sec": 0, 00:10:46.644 "w_mbytes_per_sec": 0 00:10:46.644 }, 00:10:46.644 "claimed": false, 00:10:46.644 "zoned": false, 00:10:46.644 "supported_io_types": { 00:10:46.644 "read": true, 00:10:46.644 
"write": true, 00:10:46.644 "unmap": true, 00:10:46.644 "flush": true, 00:10:46.644 "reset": true, 00:10:46.644 "nvme_admin": false, 00:10:46.644 "nvme_io": false, 00:10:46.644 "nvme_io_md": false, 00:10:46.644 "write_zeroes": true, 00:10:46.644 "zcopy": true, 00:10:46.644 "get_zone_info": false, 00:10:46.644 "zone_management": false, 00:10:46.644 "zone_append": false, 00:10:46.644 "compare": false, 00:10:46.644 "compare_and_write": false, 00:10:46.644 "abort": true, 00:10:46.644 "seek_hole": false, 00:10:46.644 "seek_data": false, 00:10:46.644 "copy": true, 00:10:46.644 "nvme_iov_md": false 00:10:46.644 }, 00:10:46.644 "memory_domains": [ 00:10:46.644 { 00:10:46.644 "dma_device_id": "system", 00:10:46.644 "dma_device_type": 1 00:10:46.644 }, 00:10:46.644 { 00:10:46.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.644 "dma_device_type": 2 00:10:46.644 } 00:10:46.644 ], 00:10:46.644 "driver_specific": {} 00:10:46.644 } 00:10:46.644 ] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:46.644 19:46:38 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 true 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 Dev_2 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 
-- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 [ 00:10:46.644 { 00:10:46.644 "name": "Dev_2", 00:10:46.644 "aliases": [ 00:10:46.644 "f89c59db-f276-4611-b2d1-f7e59ae1532a" 00:10:46.644 ], 00:10:46.644 "product_name": "Malloc disk", 00:10:46.644 "block_size": 512, 00:10:46.644 "num_blocks": 262144, 00:10:46.644 "uuid": "f89c59db-f276-4611-b2d1-f7e59ae1532a", 00:10:46.644 "assigned_rate_limits": { 00:10:46.644 "rw_ios_per_sec": 0, 00:10:46.644 "rw_mbytes_per_sec": 0, 00:10:46.644 "r_mbytes_per_sec": 0, 00:10:46.644 "w_mbytes_per_sec": 0 00:10:46.644 }, 00:10:46.644 "claimed": false, 00:10:46.644 "zoned": false, 00:10:46.644 "supported_io_types": { 
00:10:46.644 "read": true, 00:10:46.644 "write": true, 00:10:46.644 "unmap": true, 00:10:46.644 "flush": true, 00:10:46.644 "reset": true, 00:10:46.644 "nvme_admin": false, 00:10:46.644 "nvme_io": false, 00:10:46.644 "nvme_io_md": false, 00:10:46.644 "write_zeroes": true, 00:10:46.644 "zcopy": true, 00:10:46.644 "get_zone_info": false, 00:10:46.644 "zone_management": false, 00:10:46.644 "zone_append": false, 00:10:46.644 "compare": false, 00:10:46.644 "compare_and_write": false, 00:10:46.644 "abort": true, 00:10:46.644 "seek_hole": false, 00:10:46.644 "seek_data": false, 00:10:46.644 "copy": true, 00:10:46.644 "nvme_iov_md": false 00:10:46.644 }, 00:10:46.644 "memory_domains": [ 00:10:46.644 { 00:10:46.644 "dma_device_id": "system", 00:10:46.644 "dma_device_type": 1 00:10:46.644 }, 00:10:46.644 { 00:10:46.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.644 "dma_device_type": 2 00:10:46.644 } 00:10:46.644 ], 00:10:46.644 "driver_specific": {} 00:10:46.644 } 00:10:46.644 ] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:46.644 19:46:38 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.644 19:46:38 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.644 19:46:38 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:10:46.645 19:46:38 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:46.903 Running I/O for 5 seconds... 
00:10:47.840 19:46:39 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 1367413 00:10:47.840 19:46:39 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 1367413' 00:10:47.840 Process is existed as continue on error is set. Pid: 1367413 00:10:47.840 19:46:39 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:47.840 19:46:39 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:47.840 19:46:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:47.840 19:46:39 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:47.840 19:46:39 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:47.840 19:46:39 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:47.840 19:46:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:47.840 19:46:39 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:47.840 19:46:39 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:10:47.840 Timeout while waiting for response: 00:10:47.840 00:10:47.840 00:10:52.035 00:10:52.035 Latency(us) 00:10:52.035 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:52.035 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:52.035 EE_Dev_1 : 0.90 29099.50 113.67 5.57 0.00 545.13 167.40 897.56 00:10:52.035 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:52.035 Dev_2 : 5.00 62980.63 246.02 0.00 0.00 249.56 85.93 31229.33 00:10:52.035 =================================================================================================================== 00:10:52.035 Total : 92080.13 359.69 5.57 0.00 272.20 85.93 31229.33 00:10:52.970 19:46:44 blockdev_general.bdev_error -- 
bdev/blockdev.sh@498 -- # killprocess 1367413 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # '[' -z 1367413 ']' 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # kill -0 1367413 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # uname 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1367413 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1367413' 00:10:52.970 killing process with pid 1367413 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@969 -- # kill 1367413 00:10:52.970 Received shutdown signal, test time was about 5.000000 seconds 00:10:52.970 00:10:52.970 Latency(us) 00:10:52.970 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:52.970 =================================================================================================================== 00:10:52.970 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:52.970 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@974 -- # wait 1367413 00:10:53.230 19:46:44 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=1368373 00:10:53.230 19:46:44 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 1368373' 00:10:53.230 Process error testing pid: 1368373 00:10:53.230 19:46:44 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 
4096 -w randread -t 5 '' 00:10:53.230 19:46:44 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 1368373 00:10:53.230 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 1368373 ']' 00:10:53.230 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:53.230 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:53.230 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:53.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:53.230 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:53.230 19:46:44 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.230 [2024-07-24 19:46:44.673261] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:10:53.230 [2024-07-24 19:46:44.673338] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1368373 ] 00:10:53.230 [2024-07-24 19:46:44.808555] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:53.489 [2024-07-24 19:46:44.927834] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:10:54.056 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.056 Dev_1 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.056 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:54.056 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:54.057 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:54.057 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:54.057 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.057 
19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.316 [ 00:10:54.316 { 00:10:54.316 "name": "Dev_1", 00:10:54.316 "aliases": [ 00:10:54.316 "4c79cdf9-ca83-44db-9df2-4672daef3394" 00:10:54.316 ], 00:10:54.316 "product_name": "Malloc disk", 00:10:54.316 "block_size": 512, 00:10:54.316 "num_blocks": 262144, 00:10:54.316 "uuid": "4c79cdf9-ca83-44db-9df2-4672daef3394", 00:10:54.316 "assigned_rate_limits": { 00:10:54.316 "rw_ios_per_sec": 0, 00:10:54.316 "rw_mbytes_per_sec": 0, 00:10:54.316 "r_mbytes_per_sec": 0, 00:10:54.316 "w_mbytes_per_sec": 0 00:10:54.316 }, 00:10:54.316 "claimed": false, 00:10:54.316 "zoned": false, 00:10:54.316 "supported_io_types": { 00:10:54.316 "read": true, 00:10:54.316 "write": true, 00:10:54.316 "unmap": true, 00:10:54.316 "flush": true, 00:10:54.316 "reset": true, 00:10:54.316 "nvme_admin": false, 00:10:54.316 "nvme_io": false, 00:10:54.316 "nvme_io_md": false, 00:10:54.316 "write_zeroes": true, 00:10:54.316 "zcopy": true, 00:10:54.316 "get_zone_info": false, 00:10:54.316 "zone_management": false, 00:10:54.316 "zone_append": false, 00:10:54.316 "compare": false, 00:10:54.316 "compare_and_write": false, 00:10:54.316 "abort": true, 00:10:54.316 "seek_hole": false, 00:10:54.316 "seek_data": false, 00:10:54.316 "copy": true, 00:10:54.316 "nvme_iov_md": false 00:10:54.316 }, 00:10:54.316 "memory_domains": [ 00:10:54.316 { 00:10:54.316 "dma_device_id": "system", 00:10:54.316 "dma_device_type": 1 00:10:54.316 }, 00:10:54.316 { 00:10:54.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:10:54.316 "dma_device_type": 2 00:10:54.316 } 00:10:54.316 ], 00:10:54.316 "driver_specific": {} 00:10:54.316 } 00:10:54.316 ] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:54.316 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.316 true 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.316 Dev_2 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:54.316 19:46:45 blockdev_general.bdev_error -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.316 [ 00:10:54.316 { 00:10:54.316 "name": "Dev_2", 00:10:54.316 "aliases": [ 00:10:54.316 "01f813de-584b-4b9a-a77a-b9c12fd5dbf9" 00:10:54.316 ], 00:10:54.316 "product_name": "Malloc disk", 00:10:54.316 "block_size": 512, 00:10:54.316 "num_blocks": 262144, 00:10:54.316 "uuid": "01f813de-584b-4b9a-a77a-b9c12fd5dbf9", 00:10:54.316 "assigned_rate_limits": { 00:10:54.316 "rw_ios_per_sec": 0, 00:10:54.316 "rw_mbytes_per_sec": 0, 00:10:54.316 "r_mbytes_per_sec": 0, 00:10:54.316 "w_mbytes_per_sec": 0 00:10:54.316 }, 00:10:54.316 "claimed": false, 00:10:54.316 "zoned": false, 00:10:54.316 "supported_io_types": { 00:10:54.316 "read": true, 00:10:54.316 "write": true, 00:10:54.316 "unmap": true, 00:10:54.316 "flush": true, 00:10:54.316 "reset": true, 00:10:54.316 "nvme_admin": false, 00:10:54.316 "nvme_io": false, 00:10:54.316 "nvme_io_md": false, 00:10:54.316 "write_zeroes": true, 00:10:54.316 "zcopy": true, 00:10:54.316 "get_zone_info": false, 00:10:54.316 "zone_management": false, 00:10:54.316 "zone_append": false, 00:10:54.316 "compare": false, 00:10:54.316 "compare_and_write": false, 00:10:54.316 "abort": true, 00:10:54.316 "seek_hole": false, 00:10:54.316 "seek_data": false, 00:10:54.316 "copy": true, 00:10:54.316 "nvme_iov_md": false 00:10:54.316 }, 00:10:54.316 "memory_domains": [ 00:10:54.316 { 00:10:54.316 "dma_device_id": "system", 00:10:54.316 "dma_device_type": 1 00:10:54.316 }, 00:10:54.316 { 
00:10:54.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.316 "dma_device_type": 2 00:10:54.316 } 00:10:54.316 ], 00:10:54.316 "driver_specific": {} 00:10:54.316 } 00:10:54.316 ] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:54.316 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.316 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 1368373 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # local es=0 00:10:54.316 19:46:45 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # valid_exec_arg wait 1368373 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@638 -- # local arg=wait 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # type -t wait 00:10:54.316 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:54.317 19:46:45 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # wait 1368373 00:10:54.317 Running I/O for 5 seconds... 
00:10:54.317 task offset: 90184 on job bdev=EE_Dev_1 fails 00:10:54.317 00:10:54.317 Latency(us) 00:10:54.317 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:54.317 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:54.317 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:54.317 EE_Dev_1 : 0.00 23060.80 90.08 5241.09 0.00 462.65 169.18 840.57 00:10:54.317 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:54.317 Dev_2 : 0.00 14222.22 55.56 0.00 0.00 833.28 163.84 1545.79 00:10:54.317 =================================================================================================================== 00:10:54.317 Total : 37283.02 145.64 5241.09 0.00 663.67 163.84 1545.79 00:10:54.317 [2024-07-24 19:46:45.898058] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:54.317 request: 00:10:54.317 { 00:10:54.317 "method": "perform_tests", 00:10:54.317 "req_id": 1 00:10:54.317 } 00:10:54.317 Got JSON-RPC error response 00:10:54.317 response: 00:10:54.317 { 00:10:54.317 "code": -32603, 00:10:54.317 "message": "bdevperf failed with error Operation not permitted" 00:10:54.317 } 00:10:54.885 19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # es=255 00:10:54.885 19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:54.885 19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # es=127 00:10:54.885 19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@663 -- # case "$es" in 00:10:54.885 19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@670 -- # es=1 00:10:54.885 19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:54.885 00:10:54.885 real 0m9.318s 00:10:54.885 user 0m9.625s 00:10:54.885 sys 0m1.012s 00:10:54.885 19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:54.885 
19:46:46 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.885 ************************************ 00:10:54.885 END TEST bdev_error 00:10:54.885 ************************************ 00:10:54.885 19:46:46 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:10:54.885 19:46:46 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:54.885 19:46:46 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:54.885 19:46:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:54.885 ************************************ 00:10:54.885 START TEST bdev_stat 00:10:54.885 ************************************ 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # stat_test_suite '' 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=1368584 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 1368584' 00:10:54.885 Process Bdev IO statistics testing pid: 1368584 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 1368584 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # '[' -z 1368584 ']' 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:54.885 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # local max_retries=100 
00:10:54.885 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:54.885 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:54.886 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:54.886 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:54.886 [2024-07-24 19:46:46.430949] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:10:54.886 [2024-07-24 19:46:46.431016] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1368584 ] 00:10:55.144 [2024-07-24 19:46:46.562910] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:55.145 [2024-07-24 19:46:46.673404] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:55.145 [2024-07-24 19:46:46.673408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@864 -- # return 0 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:55.403 Malloc_STAT 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:10:55.403 19:46:46 
blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_STAT 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # local i 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:55.403 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:55.403 [ 00:10:55.403 { 00:10:55.403 "name": "Malloc_STAT", 00:10:55.403 "aliases": [ 00:10:55.403 "efb781e3-6313-4a85-9f23-860f13bebd98" 00:10:55.403 ], 00:10:55.403 "product_name": "Malloc disk", 00:10:55.403 "block_size": 512, 00:10:55.403 "num_blocks": 262144, 00:10:55.403 "uuid": "efb781e3-6313-4a85-9f23-860f13bebd98", 00:10:55.403 "assigned_rate_limits": { 00:10:55.403 "rw_ios_per_sec": 0, 00:10:55.403 "rw_mbytes_per_sec": 0, 00:10:55.403 "r_mbytes_per_sec": 0, 00:10:55.403 "w_mbytes_per_sec": 0 00:10:55.403 }, 00:10:55.403 "claimed": false, 00:10:55.403 "zoned": false, 00:10:55.403 "supported_io_types": { 00:10:55.403 "read": true, 00:10:55.403 "write": true, 00:10:55.403 "unmap": true, 00:10:55.403 "flush": true, 00:10:55.403 "reset": 
true, 00:10:55.403 "nvme_admin": false, 00:10:55.403 "nvme_io": false, 00:10:55.403 "nvme_io_md": false, 00:10:55.403 "write_zeroes": true, 00:10:55.403 "zcopy": true, 00:10:55.403 "get_zone_info": false, 00:10:55.403 "zone_management": false, 00:10:55.403 "zone_append": false, 00:10:55.403 "compare": false, 00:10:55.403 "compare_and_write": false, 00:10:55.403 "abort": true, 00:10:55.403 "seek_hole": false, 00:10:55.403 "seek_data": false, 00:10:55.403 "copy": true, 00:10:55.403 "nvme_iov_md": false 00:10:55.403 }, 00:10:55.403 "memory_domains": [ 00:10:55.403 { 00:10:55.404 "dma_device_id": "system", 00:10:55.404 "dma_device_type": 1 00:10:55.404 }, 00:10:55.404 { 00:10:55.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.404 "dma_device_type": 2 00:10:55.404 } 00:10:55.404 ], 00:10:55.404 "driver_specific": {} 00:10:55.404 } 00:10:55.404 ] 00:10:55.404 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:55.404 19:46:46 blockdev_general.bdev_stat -- common/autotest_common.sh@907 -- # return 0 00:10:55.404 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:10:55.404 19:46:46 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:55.662 Running I/O for 10 seconds... 
00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:57.566 19:46:48 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.566 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:57.566 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:10:57.566 "tick_rate": 2300000000, 00:10:57.566 "ticks": 7195793504598994, 00:10:57.566 "bdevs": [ 00:10:57.566 { 00:10:57.566 "name": "Malloc_STAT", 00:10:57.566 "bytes_read": 678474240, 00:10:57.566 "num_read_ops": 165636, 00:10:57.566 "bytes_written": 0, 00:10:57.566 "num_write_ops": 0, 00:10:57.566 "bytes_unmapped": 0, 00:10:57.566 "num_unmap_ops": 0, 00:10:57.566 "bytes_copied": 0, 00:10:57.566 "num_copy_ops": 0, 00:10:57.566 "read_latency_ticks": 2224087598554, 00:10:57.566 "max_read_latency_ticks": 17603074, 00:10:57.566 "min_read_latency_ticks": 271942, 
00:10:57.566 "write_latency_ticks": 0, 00:10:57.566 "max_write_latency_ticks": 0, 00:10:57.566 "min_write_latency_ticks": 0, 00:10:57.566 "unmap_latency_ticks": 0, 00:10:57.566 "max_unmap_latency_ticks": 0, 00:10:57.566 "min_unmap_latency_ticks": 0, 00:10:57.566 "copy_latency_ticks": 0, 00:10:57.566 "max_copy_latency_ticks": 0, 00:10:57.566 "min_copy_latency_ticks": 0, 00:10:57.567 "io_error": {} 00:10:57.567 } 00:10:57.567 ] 00:10:57.567 }' 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=165636 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:10:57.567 "tick_rate": 2300000000, 00:10:57.567 "ticks": 7195793665069624, 00:10:57.567 "name": "Malloc_STAT", 00:10:57.567 "channels": [ 00:10:57.567 { 00:10:57.567 "thread_id": 2, 00:10:57.567 "bytes_read": 383778816, 00:10:57.567 "num_read_ops": 93696, 00:10:57.567 "bytes_written": 0, 00:10:57.567 "num_write_ops": 0, 00:10:57.567 "bytes_unmapped": 0, 00:10:57.567 "num_unmap_ops": 0, 00:10:57.567 "bytes_copied": 0, 00:10:57.567 "num_copy_ops": 0, 00:10:57.567 "read_latency_ticks": 1152479132348, 00:10:57.567 "max_read_latency_ticks": 13075742, 00:10:57.567 "min_read_latency_ticks": 8410964, 00:10:57.567 "write_latency_ticks": 0, 00:10:57.567 "max_write_latency_ticks": 0, 00:10:57.567 "min_write_latency_ticks": 0, 00:10:57.567 "unmap_latency_ticks": 0, 00:10:57.567 "max_unmap_latency_ticks": 0, 00:10:57.567 
"min_unmap_latency_ticks": 0, 00:10:57.567 "copy_latency_ticks": 0, 00:10:57.567 "max_copy_latency_ticks": 0, 00:10:57.567 "min_copy_latency_ticks": 0 00:10:57.567 }, 00:10:57.567 { 00:10:57.567 "thread_id": 3, 00:10:57.567 "bytes_read": 319815680, 00:10:57.567 "num_read_ops": 78080, 00:10:57.567 "bytes_written": 0, 00:10:57.567 "num_write_ops": 0, 00:10:57.567 "bytes_unmapped": 0, 00:10:57.567 "num_unmap_ops": 0, 00:10:57.567 "bytes_copied": 0, 00:10:57.567 "num_copy_ops": 0, 00:10:57.567 "read_latency_ticks": 1154196902526, 00:10:57.567 "max_read_latency_ticks": 17603074, 00:10:57.567 "min_read_latency_ticks": 9756324, 00:10:57.567 "write_latency_ticks": 0, 00:10:57.567 "max_write_latency_ticks": 0, 00:10:57.567 "min_write_latency_ticks": 0, 00:10:57.567 "unmap_latency_ticks": 0, 00:10:57.567 "max_unmap_latency_ticks": 0, 00:10:57.567 "min_unmap_latency_ticks": 0, 00:10:57.567 "copy_latency_ticks": 0, 00:10:57.567 "max_copy_latency_ticks": 0, 00:10:57.567 "min_copy_latency_ticks": 0 00:10:57.567 } 00:10:57.567 ] 00:10:57.567 }' 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # io_count_per_channel1=93696 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=93696 00:10:57.567 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=78080 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=171776 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- 
common/autotest_common.sh@10 -- # set +x 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:10:57.826 "tick_rate": 2300000000, 00:10:57.826 "ticks": 7195793950633738, 00:10:57.826 "bdevs": [ 00:10:57.826 { 00:10:57.826 "name": "Malloc_STAT", 00:10:57.826 "bytes_read": 747680256, 00:10:57.826 "num_read_ops": 182532, 00:10:57.826 "bytes_written": 0, 00:10:57.826 "num_write_ops": 0, 00:10:57.826 "bytes_unmapped": 0, 00:10:57.826 "num_unmap_ops": 0, 00:10:57.826 "bytes_copied": 0, 00:10:57.826 "num_copy_ops": 0, 00:10:57.826 "read_latency_ticks": 2451030784014, 00:10:57.826 "max_read_latency_ticks": 17603074, 00:10:57.826 "min_read_latency_ticks": 271942, 00:10:57.826 "write_latency_ticks": 0, 00:10:57.826 "max_write_latency_ticks": 0, 00:10:57.826 "min_write_latency_ticks": 0, 00:10:57.826 "unmap_latency_ticks": 0, 00:10:57.826 "max_unmap_latency_ticks": 0, 00:10:57.826 "min_unmap_latency_ticks": 0, 00:10:57.826 "copy_latency_ticks": 0, 00:10:57.826 "max_copy_latency_ticks": 0, 00:10:57.826 "min_copy_latency_ticks": 0, 00:10:57.826 "io_error": {} 00:10:57.826 } 00:10:57.826 ] 00:10:57.826 }' 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=182532 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 171776 -lt 165636 ']' 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 171776 -gt 182532 ']' 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:57.826 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.826 00:10:57.826 
Latency(us) 00:10:57.826 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:57.826 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:57.826 Malloc_STAT : 2.17 47755.04 186.54 0.00 0.00 5347.81 1417.57 5698.78 00:10:57.826 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:57.826 Malloc_STAT : 2.17 39809.79 155.51 0.00 0.00 6414.03 1189.62 7693.36 00:10:57.826 =================================================================================================================== 00:10:57.826 Total : 87564.83 342.05 0.00 0.00 5832.72 1189.62 7693.36 00:10:57.826 0 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 1368584 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # '[' -z 1368584 ']' 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # kill -0 1368584 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # uname 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1368584 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1368584' 00:10:57.827 killing process with pid 1368584 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@969 -- # kill 1368584 00:10:57.827 Received shutdown signal, test time was about 2.247637 seconds 00:10:57.827 00:10:57.827 Latency(us) 
00:10:57.827 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:57.827 =================================================================================================================== 00:10:57.827 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:57.827 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@974 -- # wait 1368584 00:10:58.086 19:46:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:10:58.086 00:10:58.086 real 0m3.201s 00:10:58.086 user 0m6.426s 00:10:58.086 sys 0m0.486s 00:10:58.086 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:58.086 19:46:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:58.086 ************************************ 00:10:58.086 END TEST bdev_stat 00:10:58.086 ************************************ 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:58.086 19:46:49 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:58.086 00:10:58.086 real 1m57.006s 00:10:58.086 user 7m11.623s 00:10:58.086 sys 0m23.179s 00:10:58.086 19:46:49 
blockdev_general -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:58.086 19:46:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:58.086 ************************************ 00:10:58.086 END TEST blockdev_general 00:10:58.086 ************************************ 00:10:58.086 19:46:49 -- spdk/autotest.sh@194 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:58.086 19:46:49 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:58.086 19:46:49 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:58.086 19:46:49 -- common/autotest_common.sh@10 -- # set +x 00:10:58.346 ************************************ 00:10:58.346 START TEST bdev_raid 00:10:58.346 ************************************ 00:10:58.346 19:46:49 bdev_raid -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:58.346 * Looking for test storage... 00:10:58.346 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:58.346 19:46:49 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:58.346 19:46:49 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:58.346 19:46:49 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:58.346 19:46:49 bdev_raid -- bdev/bdev_raid.sh@927 -- # mkdir -p /raidtest 00:10:58.346 19:46:49 bdev_raid -- bdev/bdev_raid.sh@928 -- # trap 'cleanup; exit 1' EXIT 00:10:58.346 19:46:49 bdev_raid -- bdev/bdev_raid.sh@930 -- # base_blocklen=512 00:10:58.346 19:46:49 bdev_raid -- bdev/bdev_raid.sh@932 -- # run_test raid0_resize_superblock_test raid_resize_superblock_test 0 00:10:58.346 19:46:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:58.346 19:46:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 
00:10:58.346 19:46:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:58.346 ************************************ 00:10:58.346 START TEST raid0_resize_superblock_test 00:10:58.346 ************************************ 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 0 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=0 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=1369182 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 1369182' 00:10:58.346 Process raid pid: 1369182 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # waitforlisten 1369182 /var/tmp/spdk-raid.sock 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1369182 ']' 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:58.346 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:58.346 19:46:49 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.346 [2024-07-24 19:46:49.938732] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:10:58.605 [2024-07-24 19:46:49.938803] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:58.605 [2024-07-24 19:46:50.072615] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.605 [2024-07-24 19:46:50.173135] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.864 [2024-07-24 19:46:50.235710] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.864 [2024-07-24 19:46:50.235741] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.432 19:46:50 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:59.432 19:46:50 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:10:59.432 19:46:50 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:10:59.691 malloc0 00:10:59.691 19:46:51 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:10:59.950 [2024-07-24 19:46:51.464881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:10:59.950 [2024-07-24 19:46:51.464934] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.950 [2024-07-24 
19:46:51.464958] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x276d730 00:10:59.950 [2024-07-24 19:46:51.464971] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.950 [2024-07-24 19:46:51.466566] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.950 [2024-07-24 19:46:51.466595] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:10:59.950 pt0 00:10:59.950 19:46:51 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0 00:11:00.210 fdfe1198-f2ae-4905-a3f8-debdc684e0de 00:11:00.469 19:46:51 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64 00:11:00.469 b807930a-34c1-439f-b242-09e8abe8b385 00:11:00.728 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64 00:11:00.728 3c6b73c5-d4b5-4687-98d8-c8d7c87896e9 00:11:00.728 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in 00:11:00.728 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@884 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 0 -z 64 -b 'lvs0/lvol0 lvs0/lvol1' -s 00:11:00.987 [2024-07-24 19:46:52.541550] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev b807930a-34c1-439f-b242-09e8abe8b385 is claimed 00:11:00.987 [2024-07-24 19:46:52.541626] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 3c6b73c5-d4b5-4687-98d8-c8d7c87896e9 is claimed 00:11:00.987 [2024-07-24 19:46:52.541765] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x28047a0 00:11:00.987 [2024-07-24 19:46:52.541777] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 245760, blocklen 512 00:11:00.987 [2024-07-24 19:46:52.541963] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2804770 00:11:00.987 [2024-07-24 19:46:52.542121] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28047a0 00:11:00.987 [2024-07-24 19:46:52.542132] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x28047a0 00:11:00.987 [2024-07-24 19:46:52.542237] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:00.987 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:11:00.987 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks' 00:11:01.246 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 )) 00:11:01.246 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:11:01.246 19:46:52 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks' 00:11:01.505 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 )) 00:11:01.505 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:11:01.505 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:01.505 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 
00:11:01.505 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # jq '.[].num_blocks' 00:11:01.764 [2024-07-24 19:46:53.291725] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:01.764 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:11:01.764 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:11:01.764 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # (( 245760 == 245760 )) 00:11:01.764 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100 00:11:02.023 [2024-07-24 19:46:53.532327] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:02.023 [2024-07-24 19:46:53.532351] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'b807930a-34c1-439f-b242-09e8abe8b385' was resized: old size 131072, new size 204800 00:11:02.023 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100 00:11:02.281 [2024-07-24 19:46:53.772912] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:02.281 [2024-07-24 19:46:53.772933] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '3c6b73c5-d4b5-4687-98d8-c8d7c87896e9' was resized: old size 131072, new size 204800 00:11:02.281 [2024-07-24 19:46:53.772957] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 245760 to 393216 00:11:02.281 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 
00:11:02.282 19:46:53 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks' 00:11:02.551 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 )) 00:11:02.552 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:11:02.552 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks' 00:11:02.902 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 )) 00:11:02.902 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:02.902 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:02.902 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:02.902 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # jq '.[].num_blocks' 00:11:03.162 [2024-07-24 19:46:54.563100] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:03.162 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:03.162 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:03.162 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # (( 393216 == 393216 )) 00:11:03.162 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0 00:11:03.421 [2024-07-24 19:46:54.807545] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0 00:11:03.421 
[2024-07-24 19:46:54.807614] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0 00:11:03.421 [2024-07-24 19:46:54.807624] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:03.421 [2024-07-24 19:46:54.807636] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1 00:11:03.421 [2024-07-24 19:46:54.807729] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:03.421 [2024-07-24 19:46:54.807759] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:03.421 [2024-07-24 19:46:54.807771] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28047a0 name Raid, state offline 00:11:03.421 19:46:54 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:11:03.680 [2024-07-24 19:46:55.056170] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:11:03.680 [2024-07-24 19:46:55.056219] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:03.680 [2024-07-24 19:46:55.056243] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x276d960 00:11:03.680 [2024-07-24 19:46:55.056255] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:03.680 [2024-07-24 19:46:55.057921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:03.680 [2024-07-24 19:46:55.057952] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:11:03.680 [2024-07-24 19:46:55.059194] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev b807930a-34c1-439f-b242-09e8abe8b385 00:11:03.680 [2024-07-24 19:46:55.059237] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev b807930a-34c1-439f-b242-09e8abe8b385 is 
claimed 00:11:03.680 [2024-07-24 19:46:55.059331] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 3c6b73c5-d4b5-4687-98d8-c8d7c87896e9 00:11:03.680 [2024-07-24 19:46:55.059352] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 3c6b73c5-d4b5-4687-98d8-c8d7c87896e9 is claimed 00:11:03.680 [2024-07-24 19:46:55.059481] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev 3c6b73c5-d4b5-4687-98d8-c8d7c87896e9 (2) smaller than existing raid bdev Raid (3) 00:11:03.680 [2024-07-24 19:46:55.059523] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2905e80 00:11:03.680 [2024-07-24 19:46:55.059531] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 393216, blocklen 512 00:11:03.680 [2024-07-24 19:46:55.059697] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2803980 00:11:03.680 [2024-07-24 19:46:55.059843] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2905e80 00:11:03.680 [2024-07-24 19:46:55.059852] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2905e80 00:11:03.680 [2024-07-24 19:46:55.059962] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:03.680 pt0 00:11:03.680 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:03.680 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:03.680 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:03.680 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # jq '.[].num_blocks' 00:11:03.940 [2024-07-24 19:46:55.297050] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:03.940 19:46:55 
bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # (( 393216 == 393216 )) 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 1369182 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1369182 ']' 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1369182 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # uname 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1369182 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1369182' 00:11:03.940 killing process with pid 1369182 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 1369182 00:11:03.940 [2024-07-24 19:46:55.362520] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:03.940 [2024-07-24 19:46:55.362569] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:03.940 [2024-07-24 19:46:55.362609] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:03.940 [2024-07-24 19:46:55.362621] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2905e80 name Raid, state offline 00:11:03.940 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 1369182 00:11:03.940 [2024-07-24 19:46:55.453042] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:04.199 19:46:55 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0 00:11:04.199 00:11:04.199 real 0m5.806s 00:11:04.199 user 0m9.483s 00:11:04.199 sys 0m1.258s 00:11:04.199 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:04.199 19:46:55 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.199 ************************************ 00:11:04.199 END TEST raid0_resize_superblock_test 00:11:04.199 ************************************ 00:11:04.199 19:46:55 bdev_raid -- bdev/bdev_raid.sh@933 -- # run_test raid1_resize_superblock_test raid_resize_superblock_test 1 00:11:04.199 19:46:55 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:04.199 19:46:55 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:04.199 19:46:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:04.199 ************************************ 00:11:04.199 START TEST raid1_resize_superblock_test 00:11:04.199 ************************************ 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 1 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=1 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=1369950 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:04.199 19:46:55 
bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 1369950' 00:11:04.199 Process raid pid: 1369950 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # waitforlisten 1369950 /var/tmp/spdk-raid.sock 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1369950 ']' 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:04.199 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:04.199 19:46:55 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.458 [2024-07-24 19:46:55.815429] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:11:04.458 [2024-07-24 19:46:55.815494] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:04.458 [2024-07-24 19:46:55.950563] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:04.717 [2024-07-24 19:46:56.058187] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:04.717 [2024-07-24 19:46:56.132926] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:04.717 [2024-07-24 19:46:56.132966] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:05.285 19:46:56 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:05.286 19:46:56 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:11:05.286 19:46:56 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:11:05.545 malloc0 00:11:05.545 19:46:57 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:11:05.804 [2024-07-24 19:46:57.340504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:11:05.804 [2024-07-24 19:46:57.340553] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:05.804 [2024-07-24 19:46:57.340578] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f16730 00:11:05.804 [2024-07-24 19:46:57.340591] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:05.804 [2024-07-24 19:46:57.342300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:11:05.804 [2024-07-24 19:46:57.342329] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:11:05.804 pt0 00:11:05.804 19:46:57 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0 00:11:06.372 9a9c9ffe-6e39-4dbe-a25a-ed2ffd2211c0 00:11:06.372 19:46:57 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64 00:11:06.372 0cbfe273-ea81-4b00-9883-c3ae5f83e5b8 00:11:06.632 19:46:57 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64 00:11:06.632 1a4b24c8-546e-49e3-a4ff-5e66d20f8289 00:11:06.632 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in 00:11:06.632 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@885 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 1 -b 'lvs0/lvol0 lvs0/lvol1' -s 00:11:06.891 [2024-07-24 19:46:58.372296] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 0cbfe273-ea81-4b00-9883-c3ae5f83e5b8 is claimed 00:11:06.891 [2024-07-24 19:46:58.372382] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 1a4b24c8-546e-49e3-a4ff-5e66d20f8289 is claimed 00:11:06.891 [2024-07-24 19:46:58.372540] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fad7a0 00:11:06.891 [2024-07-24 19:46:58.372553] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 122880, blocklen 512 00:11:06.891 [2024-07-24 19:46:58.372749] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fad740 00:11:06.891 [2024-07-24 
19:46:58.372924] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fad7a0 00:11:06.891 [2024-07-24 19:46:58.372935] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1fad7a0 00:11:06.891 [2024-07-24 19:46:58.373046] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:06.891 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:11:06.891 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks' 00:11:07.150 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 )) 00:11:07.150 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:11:07.150 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks' 00:11:07.409 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 )) 00:11:07.410 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:11:07.410 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # jq '.[].num_blocks' 00:11:07.410 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:11:07.410 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:07.669 [2024-07-24 19:46:59.114452] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:07.669 19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:11:07.669 
19:46:58 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:11:07.669 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # (( 122880 == 122880 )) 00:11:07.669 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100 00:11:07.929 [2024-07-24 19:46:59.363054] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:07.929 [2024-07-24 19:46:59.363081] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '0cbfe273-ea81-4b00-9883-c3ae5f83e5b8' was resized: old size 131072, new size 204800 00:11:07.929 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100 00:11:08.189 [2024-07-24 19:46:59.603668] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:08.189 [2024-07-24 19:46:59.603696] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '1a4b24c8-546e-49e3-a4ff-5e66d20f8289' was resized: old size 131072, new size 204800 00:11:08.189 [2024-07-24 19:46:59.603721] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 122880 to 196608 00:11:08.189 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:11:08.189 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks' 00:11:08.448 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 )) 00:11:08.448 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:11:08.448 19:46:59 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks' 00:11:08.708 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 )) 00:11:08.708 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:08.708 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:08.708 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:08.708 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # jq '.[].num_blocks' 00:11:09.276 [2024-07-24 19:47:00.594361] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:09.276 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:09.276 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:11:09.276 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # (( 196608 == 196608 )) 00:11:09.276 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0 00:11:09.276 [2024-07-24 19:47:00.850839] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0 00:11:09.276 [2024-07-24 19:47:00.850902] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0 00:11:09.276 [2024-07-24 19:47:00.850925] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1 00:11:09.276 [2024-07-24 19:47:00.851046] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state 
changing from online to offline 00:11:09.276 [2024-07-24 19:47:00.851187] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:09.276 [2024-07-24 19:47:00.851247] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:09.276 [2024-07-24 19:47:00.851260] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fad7a0 name Raid, state offline 00:11:09.276 19:47:00 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:11:09.535 [2024-07-24 19:47:01.091435] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:11:09.535 [2024-07-24 19:47:01.091469] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:09.535 [2024-07-24 19:47:01.091488] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f16960 00:11:09.535 [2024-07-24 19:47:01.091500] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:09.535 [2024-07-24 19:47:01.093093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:09.535 [2024-07-24 19:47:01.093120] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:11:09.535 [2024-07-24 19:47:01.094323] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 0cbfe273-ea81-4b00-9883-c3ae5f83e5b8 00:11:09.535 [2024-07-24 19:47:01.094368] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 0cbfe273-ea81-4b00-9883-c3ae5f83e5b8 is claimed 00:11:09.535 [2024-07-24 19:47:01.094467] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 1a4b24c8-546e-49e3-a4ff-5e66d20f8289 00:11:09.535 [2024-07-24 19:47:01.094486] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 1a4b24c8-546e-49e3-a4ff-5e66d20f8289 is 
claimed 00:11:09.535 [2024-07-24 19:47:01.094598] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev 1a4b24c8-546e-49e3-a4ff-5e66d20f8289 (2) smaller than existing raid bdev Raid (3) 00:11:09.535 [2024-07-24 19:47:01.094629] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20aef50 00:11:09.535 [2024-07-24 19:47:01.094637] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:09.535 [2024-07-24 19:47:01.094797] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20af7c0 00:11:09.535 [2024-07-24 19:47:01.094941] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20aef50 00:11:09.535 [2024-07-24 19:47:01.094951] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x20aef50 00:11:09.535 [2024-07-24 19:47:01.095056] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:09.535 pt0 00:11:09.535 19:47:01 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:09.535 19:47:01 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # jq '.[].num_blocks' 00:11:09.535 19:47:01 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:09.535 19:47:01 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:09.794 [2024-07-24 19:47:01.344338] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # (( 196608 == 196608 )) 00:11:09.794 19:47:01 
bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 1369950 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1369950 ']' 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1369950 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # uname 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:09.794 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1369950 00:11:10.053 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:10.053 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:10.053 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1369950' 00:11:10.053 killing process with pid 1369950 00:11:10.053 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 1369950 00:11:10.053 [2024-07-24 19:47:01.415481] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:10.053 [2024-07-24 19:47:01.415526] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:10.053 [2024-07-24 19:47:01.415570] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:10.053 [2024-07-24 19:47:01.415581] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20aef50 name Raid, state offline 00:11:10.053 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 1369950 00:11:10.053 [2024-07-24 19:47:01.497601] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:10.313 19:47:01 
bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0 00:11:10.313 00:11:10.313 real 0m5.958s 00:11:10.313 user 0m9.743s 00:11:10.313 sys 0m1.209s 00:11:10.313 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:10.313 19:47:01 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.313 ************************************ 00:11:10.313 END TEST raid1_resize_superblock_test 00:11:10.313 ************************************ 00:11:10.313 19:47:01 bdev_raid -- bdev/bdev_raid.sh@935 -- # uname -s 00:11:10.313 19:47:01 bdev_raid -- bdev/bdev_raid.sh@935 -- # '[' Linux = Linux ']' 00:11:10.313 19:47:01 bdev_raid -- bdev/bdev_raid.sh@935 -- # modprobe -n nbd 00:11:10.313 19:47:01 bdev_raid -- bdev/bdev_raid.sh@936 -- # has_nbd=true 00:11:10.313 19:47:01 bdev_raid -- bdev/bdev_raid.sh@937 -- # modprobe nbd 00:11:10.313 19:47:01 bdev_raid -- bdev/bdev_raid.sh@938 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:11:10.313 19:47:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:10.313 19:47:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:10.313 19:47:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:10.313 ************************************ 00:11:10.313 START TEST raid_function_test_raid0 00:11:10.313 ************************************ 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # raid_function_test raid0 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1370879 00:11:10.313 
19:47:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1370879' 00:11:10.313 Process raid pid: 1370879 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1370879 /var/tmp/spdk-raid.sock 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # '[' -z 1370879 ']' 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:10.313 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:10.313 19:47:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:10.313 [2024-07-24 19:47:01.892689] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:11:10.313 [2024-07-24 19:47:01.892757] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:10.573 [2024-07-24 19:47:02.014828] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:10.573 [2024-07-24 19:47:02.118096] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:10.832 [2024-07-24 19:47:02.183880] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:10.832 [2024-07-24 19:47:02.183908] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.399 19:47:02 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:11.399 19:47:02 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # return 0 00:11:11.399 19:47:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:11:11.399 19:47:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:11:11.399 19:47:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:11.399 19:47:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:11:11.399 19:47:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:11.658 [2024-07-24 19:47:03.021111] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:11.658 [2024-07-24 19:47:03.022252] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:11.658 [2024-07-24 19:47:03.022314] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x146fb60 00:11:11.658 [2024-07-24 19:47:03.022325] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:11.658 [2024-07-24 19:47:03.022582] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146ff40 00:11:11.658 [2024-07-24 19:47:03.022696] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x146fb60 00:11:11.658 [2024-07-24 19:47:03.022706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x146fb60 00:11:11.658 [2024-07-24 19:47:03.022807] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:11.658 Base_1 00:11:11.658 Base_2 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:11.658 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:11.917 [2024-07-24 19:47:03.462299] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146ff40 00:11:11.917 /dev/nbd0 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # local i 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@873 -- # break 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:11.917 1+0 records in 00:11:11.917 1+0 
records out 00:11:11.917 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245383 s, 16.7 MB/s 00:11:11.917 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # size=4096 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@889 -- # return 0 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:12.176 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:12.435 { 00:11:12.435 "nbd_device": "/dev/nbd0", 00:11:12.435 "bdev_name": "raid" 00:11:12.435 } 00:11:12.435 ]' 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:12.435 { 00:11:12.435 "nbd_device": "/dev/nbd0", 00:11:12.435 "bdev_name": "raid" 00:11:12.435 } 00:11:12.435 ]' 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:12.435 19:47:03 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:12.435 4096+0 records in 00:11:12.435 4096+0 records out 00:11:12.435 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0306575 s, 68.4 MB/s 00:11:12.435 19:47:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:12.694 4096+0 records in 00:11:12.694 4096+0 records out 00:11:12.694 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.258779 s, 8.1 MB/s 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:12.694 128+0 records in 00:11:12.694 128+0 records out 00:11:12.694 65536 
bytes (66 kB, 64 KiB) copied, 0.000821941 s, 79.7 MB/s 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:12.694 2035+0 records in 00:11:12.694 2035+0 records out 00:11:12.694 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0105866 s, 98.4 MB/s 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:12.694 456+0 records in 00:11:12.694 456+0 records out 00:11:12.694 233472 bytes (233 kB, 228 KiB) copied, 0.00270019 s, 86.5 MB/s 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:12.694 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:12.953 [2024-07-24 19:47:04.504272] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:12.953 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:13.212 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:13.212 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:13.212 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1370879 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # '[' -z 1370879 ']' 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # kill -0 1370879 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # uname 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1370879 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1370879' 00:11:13.473 killing process with pid 1370879 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@969 -- # kill 1370879 00:11:13.473 [2024-07-24 19:47:04.880485] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:13.473 [2024-07-24 19:47:04.880553] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:13.473 [2024-07-24 19:47:04.880595] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:13.473 [2024-07-24 19:47:04.880607] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x146fb60 name 
raid, state offline 00:11:13.473 19:47:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@974 -- # wait 1370879 00:11:13.473 [2024-07-24 19:47:04.897678] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:13.733 19:47:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:11:13.733 00:11:13.733 real 0m3.283s 00:11:13.733 user 0m4.305s 00:11:13.733 sys 0m1.224s 00:11:13.733 19:47:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:13.733 19:47:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:13.733 ************************************ 00:11:13.733 END TEST raid_function_test_raid0 00:11:13.733 ************************************ 00:11:13.733 19:47:05 bdev_raid -- bdev/bdev_raid.sh@939 -- # run_test raid_function_test_concat raid_function_test concat 00:11:13.733 19:47:05 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:13.733 19:47:05 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:13.733 19:47:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:13.733 ************************************ 00:11:13.733 START TEST raid_function_test_concat 00:11:13.733 ************************************ 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # raid_function_test concat 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1371325 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1371325' 00:11:13.733 Process raid pid: 
1371325 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1371325 /var/tmp/spdk-raid.sock 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # '[' -z 1371325 ']' 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:13.733 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:13.733 19:47:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:13.733 [2024-07-24 19:47:05.261398] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:11:13.733 [2024-07-24 19:47:05.261468] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:13.992 [2024-07-24 19:47:05.392280] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:13.992 [2024-07-24 19:47:05.494654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:13.992 [2024-07-24 19:47:05.551426] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:13.992 [2024-07-24 19:47:05.551457] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # return 0 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:14.931 [2024-07-24 19:47:06.460335] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:14.931 [2024-07-24 19:47:06.461483] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:14.931 [2024-07-24 19:47:06.461545] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1456b60 00:11:14.931 [2024-07-24 19:47:06.461557] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:14.931 [2024-07-24 19:47:06.461812] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1456f40 00:11:14.931 [2024-07-24 19:47:06.461927] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1456b60 00:11:14.931 [2024-07-24 19:47:06.461937] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1456b60 00:11:14.931 [2024-07-24 19:47:06.462040] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:14.931 Base_1 00:11:14.931 Base_2 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:14.931 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:15.190 19:47:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:15.758 [2024-07-24 19:47:07.218366] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1456f40 00:11:15.758 /dev/nbd0 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # local i 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@873 -- # break 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:15.758 1+0 records in 
00:11:15.758 1+0 records out 00:11:15.758 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262629 s, 15.6 MB/s 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # size=4096 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@889 -- # return 0 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:15.758 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:16.016 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:16.016 { 00:11:16.016 "nbd_device": "/dev/nbd0", 00:11:16.016 "bdev_name": "raid" 00:11:16.016 } 00:11:16.016 ]' 00:11:16.016 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:16.017 { 00:11:16.017 "nbd_device": "/dev/nbd0", 00:11:16.017 "bdev_name": "raid" 00:11:16.017 } 00:11:16.017 ]' 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:16.017 19:47:07 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:16.017 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:16.275 4096+0 records in 00:11:16.275 4096+0 records out 00:11:16.275 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0301068 s, 69.7 MB/s 00:11:16.275 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:16.535 4096+0 records in 00:11:16.535 4096+0 records out 00:11:16.535 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.316545 s, 6.6 MB/s 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:16.535 
128+0 records in 00:11:16.535 128+0 records out 00:11:16.535 65536 bytes (66 kB, 64 KiB) copied, 0.000809966 s, 80.9 MB/s 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:16.535 19:47:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:16.535 2035+0 records in 00:11:16.535 2035+0 records out 00:11:16.535 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.010936 s, 95.3 MB/s 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:16.535 456+0 records in 00:11:16.535 456+0 records out 00:11:16.535 233472 bytes (233 kB, 228 KiB) copied, 0.00267855 s, 87.2 MB/s 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:16.535 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:16.794 [2024-07-24 19:47:08.314229] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:16.794 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:17.053 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:17.053 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:17.053 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:17.053 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:17.053 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:17.053 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # true 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1371325 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # '[' -z 1371325 ']' 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # kill -0 1371325 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # uname 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1371325 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1371325' 00:11:17.312 killing process with pid 1371325 00:11:17.312 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@969 -- # kill 1371325 00:11:17.312 [2024-07-24 19:47:08.699185] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:17.312 [2024-07-24 19:47:08.699254] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:17.313 [2024-07-24 19:47:08.699303] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to 
free all in destruct 00:11:17.313 [2024-07-24 19:47:08.699318] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1456b60 name raid, state offline 00:11:17.313 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@974 -- # wait 1371325 00:11:17.313 [2024-07-24 19:47:08.716804] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:17.572 19:47:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:11:17.572 00:11:17.572 real 0m3.741s 00:11:17.572 user 0m5.085s 00:11:17.572 sys 0m1.288s 00:11:17.572 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:17.572 19:47:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:17.572 ************************************ 00:11:17.572 END TEST raid_function_test_concat 00:11:17.572 ************************************ 00:11:17.572 19:47:08 bdev_raid -- bdev/bdev_raid.sh@942 -- # run_test raid0_resize_test raid_resize_test 0 00:11:17.572 19:47:08 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:17.572 19:47:08 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:17.572 19:47:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:17.572 ************************************ 00:11:17.572 START TEST raid0_resize_test 00:11:17.572 ************************************ 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 0 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=0 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@351 -- # local blkcnt 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=1371930 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 1371930' 00:11:17.572 Process raid pid: 1371930 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # waitforlisten 1371930 /var/tmp/spdk-raid.sock 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # '[' -z 1371930 ']' 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:17.572 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:17.572 19:47:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.572 [2024-07-24 19:47:09.085902] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:11:17.572 [2024-07-24 19:47:09.085956] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:17.831 [2024-07-24 19:47:09.200607] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:17.831 [2024-07-24 19:47:09.304699] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:17.831 [2024-07-24 19:47:09.360002] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:17.831 [2024-07-24 19:47:09.360027] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:18.765 19:47:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:18.765 19:47:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@864 -- # return 0 00:11:18.765 19:47:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:18.765 Base_1 00:11:18.765 19:47:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:19.023 Base_2 00:11:19.023 19:47:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 0 -eq 0 ']' 00:11:19.023 19:47:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:11:19.282 [2024-07-24 19:47:10.750543] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:19.282 [2024-07-24 19:47:10.751939] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:19.282 [2024-07-24 19:47:10.751989] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c23d10 00:11:19.282 [2024-07-24 19:47:10.752000] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:19.282 [2024-07-24 19:47:10.752195] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c23ff0 00:11:19.282 [2024-07-24 19:47:10.752282] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c23d10 00:11:19.282 [2024-07-24 19:47:10.752292] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1c23d10 00:11:19.282 [2024-07-24 19:47:10.752387] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:19.282 19:47:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:19.541 [2024-07-24 19:47:10.991152] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:19.541 [2024-07-24 19:47:10.991170] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:19.541 true 00:11:19.541 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:19.541 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:11:19.800 [2024-07-24 19:47:11.243988] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:19.800 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=131072 00:11:19.800 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=64 00:11:19.800 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 0 -eq 0 ']' 00:11:19.800 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@377 -- # 
expected_size=64 00:11:19.800 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 64 '!=' 64 ']' 00:11:19.800 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:20.367 [2024-07-24 19:47:11.745142] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:20.367 [2024-07-24 19:47:11.745166] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:20.367 [2024-07-24 19:47:11.745192] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:11:20.367 true 00:11:20.367 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:20.367 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:11:20.367 [2024-07-24 19:47:11.937805] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=262144 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=128 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 0 -eq 0 ']' 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@393 -- # expected_size=128 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 128 '!=' 128 ']' 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 1371930 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # '[' -z 1371930 ']' 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # kill -0 1371930 00:11:20.625 
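[Editor's note] The size check traced just above (blkcnt=262144, raid_size_mb=128 after both 32 MiB base bdevs are resized to 64 MiB) reflects raid0 capacity being the sum of its base bdevs. A hedged arithmetic sketch with variable names borrowed from the trace — not the actual bdev_raid.sh source:

```shell
#!/usr/bin/env bash
# Sketch of the raid0 size expectation verified above; a reconstruction of
# the arithmetic only, not the test script itself.
blksize=512
new_bdev_size_mb=64    # each null base bdev after bdev_null_resize
num_base_bdevs=2
# raid0 capacity = sum of base bdev capacities
blkcnt=$(( num_base_bdevs * new_bdev_size_mb * 1024 * 1024 / blksize ))
raid_size_mb=$(( blkcnt * blksize / 1024 / 1024 ))
echo "$blkcnt blocks -> $raid_size_mb MiB"
```

The raid1_resize_test trace further below shows the contrasting case: with the same two 64 MiB bases it reports blkcnt=131072 (64 MiB), since raid1 capacity tracks the smallest base bdev rather than the sum.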
19:47:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # uname 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:20.625 19:47:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1371930 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1371930' 00:11:20.625 killing process with pid 1371930 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@969 -- # kill 1371930 00:11:20.625 [2024-07-24 19:47:12.018639] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:20.625 [2024-07-24 19:47:12.018690] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:20.625 [2024-07-24 19:47:12.018733] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:20.625 [2024-07-24 19:47:12.018744] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c23d10 name Raid, state offline 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@974 -- # wait 1371930 00:11:20.625 [2024-07-24 19:47:12.020004] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:11:20.625 00:11:20.625 real 0m3.181s 00:11:20.625 user 0m5.002s 00:11:20.625 sys 0m0.660s 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:20.625 19:47:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.625 ************************************ 00:11:20.625 END TEST 
raid0_resize_test 00:11:20.625 ************************************ 00:11:20.927 19:47:12 bdev_raid -- bdev/bdev_raid.sh@943 -- # run_test raid1_resize_test raid_resize_test 1 00:11:20.927 19:47:12 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:20.927 19:47:12 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:20.927 19:47:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:20.927 ************************************ 00:11:20.927 START TEST raid1_resize_test 00:11:20.927 ************************************ 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 1 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=1 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=1372331 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 1372331' 00:11:20.927 Process raid pid: 1372331 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@359 
-- # waitforlisten 1372331 /var/tmp/spdk-raid.sock 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- common/autotest_common.sh@831 -- # '[' -z 1372331 ']' 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:20.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:20.927 19:47:12 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.927 [2024-07-24 19:47:12.361723] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:11:20.927 [2024-07-24 19:47:12.361789] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:21.260 [2024-07-24 19:47:12.482335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:21.260 [2024-07-24 19:47:12.589912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:21.260 [2024-07-24 19:47:12.657118] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.260 [2024-07-24 19:47:12.657156] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:21.827 19:47:13 bdev_raid.raid1_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:21.827 19:47:13 bdev_raid.raid1_resize_test -- common/autotest_common.sh@864 -- # return 0 00:11:21.827 19:47:13 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:22.087 Base_1 00:11:22.087 19:47:13 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:22.345 Base_2 00:11:22.345 19:47:13 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 1 -eq 0 ']' 00:11:22.345 19:47:13 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@367 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 1 -b 'Base_1 Base_2' -n Raid 00:11:22.604 [2024-07-24 19:47:14.005640] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:22.604 [2024-07-24 19:47:14.007141] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:22.604 [2024-07-24 19:47:14.007197] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x9bfd10 00:11:22.604 [2024-07-24 19:47:14.007208] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:22.604 [2024-07-24 19:47:14.007434] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9bfff0 00:11:22.604 [2024-07-24 19:47:14.007528] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9bfd10 00:11:22.604 [2024-07-24 19:47:14.007538] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x9bfd10 00:11:22.604 [2024-07-24 19:47:14.007645] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:22.604 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:22.863 [2024-07-24 19:47:14.250272] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:22.863 [2024-07-24 19:47:14.250291] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:22.863 true 00:11:22.863 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:22.863 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:11:23.122 [2024-07-24 19:47:14.491071] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:23.122 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=65536 00:11:23.122 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=32 00:11:23.122 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 1 -eq 0 ']' 00:11:23.122 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@379 -- # expected_size=32 
00:11:23.122 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 32 '!=' 32 ']' 00:11:23.122 19:47:14 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:23.690 [2024-07-24 19:47:14.996218] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:23.690 [2024-07-24 19:47:14.996240] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:23.690 [2024-07-24 19:47:14.996265] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 65536 to 131072 00:11:23.690 true 00:11:23.690 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:23.690 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:11:23.690 [2024-07-24 19:47:15.253050] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:23.690 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=131072 00:11:23.690 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=64 00:11:23.690 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 1 -eq 0 ']' 00:11:23.691 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@395 -- # expected_size=64 00:11:23.691 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 64 '!=' 64 ']' 00:11:23.691 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 1372331 00:11:23.691 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@950 -- # '[' -z 1372331 ']' 00:11:23.691 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@954 -- # kill -0 1372331 00:11:23.691 19:47:15 
bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # uname 00:11:23.691 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:23.691 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1372331 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1372331' 00:11:23.950 killing process with pid 1372331 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@969 -- # kill 1372331 00:11:23.950 [2024-07-24 19:47:15.323418] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:23.950 [2024-07-24 19:47:15.323466] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@974 -- # wait 1372331 00:11:23.950 [2024-07-24 19:47:15.323825] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:23.950 [2024-07-24 19:47:15.323838] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9bfd10 name Raid, state offline 00:11:23.950 [2024-07-24 19:47:15.324764] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:11:23.950 00:11:23.950 real 0m3.214s 00:11:23.950 user 0m5.022s 00:11:23.950 sys 0m0.704s 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:23.950 19:47:15 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.950 ************************************ 00:11:23.950 END TEST raid1_resize_test 
00:11:23.950 ************************************ 00:11:24.210 19:47:15 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:11:24.210 19:47:15 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:11:24.210 19:47:15 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:11:24.210 19:47:15 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:24.210 19:47:15 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:24.210 19:47:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:24.210 ************************************ 00:11:24.210 START TEST raid_state_function_test 00:11:24.210 ************************************ 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 false 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:24.210 19:47:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1372889 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1372889' 00:11:24.210 Process raid pid: 1372889 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1372889 
/var/tmp/spdk-raid.sock 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1372889 ']' 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:24.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:24.210 19:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:24.210 [2024-07-24 19:47:15.666929] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
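At this point the harness is inside `waitforlisten`, which polls until the `bdev_svc` app is up and accepting RPCs on `/var/tmp/spdk-raid.sock` before letting the test proceed. A generic retry loop in the same spirit (the helper name, attempt budget, and sleep interval here are illustrative, not SPDK's actual implementation):

```shell
# retry_until: run a command repeatedly until it succeeds or the
# attempt budget is exhausted; returns 1 if it never succeeded.
retry_until() {
    attempts="$1"; shift
    i=0
    while [ "$i" -lt "$attempts" ]; do
        if "$@"; then
            return 0
        fi
        i=$((i + 1))
        sleep 0.1
    done
    return 1
}

# e.g. wait for a UNIX domain socket to appear (path is a placeholder):
#   retry_until 100 test -S /var/tmp/spdk-raid.sock
retry_until 3 true && echo "listener up"
```

The real `waitforlisten` in `autotest_common.sh` additionally bounds retries with `max_retries=100`, as the trace above shows.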
00:11:24.210 [2024-07-24 19:47:15.667001] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:24.210 [2024-07-24 19:47:15.799876] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:24.469 [2024-07-24 19:47:15.907473] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:24.469 [2024-07-24 19:47:15.972173] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:24.469 [2024-07-24 19:47:15.972210] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.405 19:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:25.405 19:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:11:25.405 19:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:25.664 [2024-07-24 19:47:17.128199] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:25.664 [2024-07-24 19:47:17.128242] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:25.664 [2024-07-24 19:47:17.128252] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:25.664 [2024-07-24 19:47:17.128264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:25.664 19:47:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.664 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:25.923 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.923 "name": "Existed_Raid", 00:11:25.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.923 "strip_size_kb": 64, 00:11:25.923 "state": "configuring", 00:11:25.923 "raid_level": "raid0", 00:11:25.923 "superblock": false, 00:11:25.923 "num_base_bdevs": 2, 00:11:25.923 "num_base_bdevs_discovered": 0, 00:11:25.923 "num_base_bdevs_operational": 2, 00:11:25.923 "base_bdevs_list": [ 00:11:25.923 { 00:11:25.923 "name": "BaseBdev1", 00:11:25.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.923 "is_configured": false, 00:11:25.923 "data_offset": 0, 00:11:25.923 "data_size": 0 00:11:25.923 }, 00:11:25.923 { 00:11:25.923 "name": "BaseBdev2", 00:11:25.923 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:25.923 "is_configured": false, 00:11:25.923 "data_offset": 0, 00:11:25.923 "data_size": 0 00:11:25.923 } 00:11:25.923 ] 00:11:25.923 }' 00:11:25.923 19:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.923 19:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.490 19:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:26.749 [2024-07-24 19:47:18.226980] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:26.749 [2024-07-24 19:47:18.227006] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19719f0 name Existed_Raid, state configuring 00:11:26.749 19:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:27.006 [2024-07-24 19:47:18.403467] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:27.006 [2024-07-24 19:47:18.403494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:27.006 [2024-07-24 19:47:18.403503] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:27.006 [2024-07-24 19:47:18.403514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:27.006 19:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:27.264 [2024-07-24 19:47:18.662110] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:27.264 BaseBdev1 
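The `verify_raid_bdev_state` calls traced above fetch the raid bdev's JSON with `bdev_raid_get_bdevs` and compare fields such as `state` against an expected value. A simplified, self-contained stand-in for that check (the helper name and sample JSON are illustrative; the real helper lives in `bdev_raid.sh` and uses jq):

```shell
# check_raid_state: read raid bdev JSON on stdin and succeed only if
# its "state" field equals the expected value passed as $1.
check_raid_state() {
    expected="$1"
    # pull the value of the "state" key out of the JSON text
    actual=$(sed -n 's/.*"state": *"\([a-z]*\)".*/\1/p' | head -n 1)
    [ "$actual" = "$expected" ]
}

# Sample mirroring the "configuring" dump in the trace above.
json='{ "name": "Existed_Raid", "state": "configuring", "raid_level": "raid0" }'

if printf '%s\n' "$json" | check_raid_state configuring; then
    echo "state ok"
fi
```

As the trace shows, the test repeats this check after each base bdev is added: `num_base_bdevs_discovered` goes 0, 1, 2 while the state stays `configuring`, and only once both base bdevs are claimed does the array report `online`.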
00:11:27.264 19:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:27.264 19:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:11:27.264 19:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:27.264 19:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:27.264 19:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:27.264 19:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:27.264 19:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:27.522 19:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:27.781 [ 00:11:27.781 { 00:11:27.781 "name": "BaseBdev1", 00:11:27.781 "aliases": [ 00:11:27.781 "6a571d5b-66e3-4817-9b8a-c8711fceb026" 00:11:27.781 ], 00:11:27.781 "product_name": "Malloc disk", 00:11:27.781 "block_size": 512, 00:11:27.781 "num_blocks": 65536, 00:11:27.781 "uuid": "6a571d5b-66e3-4817-9b8a-c8711fceb026", 00:11:27.781 "assigned_rate_limits": { 00:11:27.781 "rw_ios_per_sec": 0, 00:11:27.781 "rw_mbytes_per_sec": 0, 00:11:27.781 "r_mbytes_per_sec": 0, 00:11:27.781 "w_mbytes_per_sec": 0 00:11:27.781 }, 00:11:27.781 "claimed": true, 00:11:27.781 "claim_type": "exclusive_write", 00:11:27.781 "zoned": false, 00:11:27.781 "supported_io_types": { 00:11:27.781 "read": true, 00:11:27.781 "write": true, 00:11:27.781 "unmap": true, 00:11:27.781 "flush": true, 00:11:27.781 "reset": true, 00:11:27.781 "nvme_admin": false, 00:11:27.781 "nvme_io": false, 00:11:27.781 "nvme_io_md": 
false, 00:11:27.781 "write_zeroes": true, 00:11:27.781 "zcopy": true, 00:11:27.781 "get_zone_info": false, 00:11:27.781 "zone_management": false, 00:11:27.781 "zone_append": false, 00:11:27.781 "compare": false, 00:11:27.781 "compare_and_write": false, 00:11:27.781 "abort": true, 00:11:27.781 "seek_hole": false, 00:11:27.781 "seek_data": false, 00:11:27.781 "copy": true, 00:11:27.781 "nvme_iov_md": false 00:11:27.781 }, 00:11:27.781 "memory_domains": [ 00:11:27.781 { 00:11:27.781 "dma_device_id": "system", 00:11:27.781 "dma_device_type": 1 00:11:27.781 }, 00:11:27.781 { 00:11:27.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.781 "dma_device_type": 2 00:11:27.781 } 00:11:27.781 ], 00:11:27.781 "driver_specific": {} 00:11:27.781 } 00:11:27.781 ] 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.781 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:28.041 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.041 "name": "Existed_Raid", 00:11:28.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.041 "strip_size_kb": 64, 00:11:28.041 "state": "configuring", 00:11:28.041 "raid_level": "raid0", 00:11:28.041 "superblock": false, 00:11:28.041 "num_base_bdevs": 2, 00:11:28.041 "num_base_bdevs_discovered": 1, 00:11:28.041 "num_base_bdevs_operational": 2, 00:11:28.041 "base_bdevs_list": [ 00:11:28.041 { 00:11:28.041 "name": "BaseBdev1", 00:11:28.041 "uuid": "6a571d5b-66e3-4817-9b8a-c8711fceb026", 00:11:28.041 "is_configured": true, 00:11:28.041 "data_offset": 0, 00:11:28.041 "data_size": 65536 00:11:28.041 }, 00:11:28.041 { 00:11:28.041 "name": "BaseBdev2", 00:11:28.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.041 "is_configured": false, 00:11:28.041 "data_offset": 0, 00:11:28.041 "data_size": 0 00:11:28.041 } 00:11:28.041 ] 00:11:28.041 }' 00:11:28.041 19:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.041 19:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:28.979 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:28.979 [2024-07-24 19:47:20.527043] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:28.979 [2024-07-24 19:47:20.527080] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19712e0 name 
Existed_Raid, state configuring 00:11:28.979 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:29.239 [2024-07-24 19:47:20.779753] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:29.239 [2024-07-24 19:47:20.781217] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:29.239 [2024-07-24 19:47:20.781248] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.239 19:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:29.498 19:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.499 "name": "Existed_Raid", 00:11:29.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.499 "strip_size_kb": 64, 00:11:29.499 "state": "configuring", 00:11:29.499 "raid_level": "raid0", 00:11:29.499 "superblock": false, 00:11:29.499 "num_base_bdevs": 2, 00:11:29.499 "num_base_bdevs_discovered": 1, 00:11:29.499 "num_base_bdevs_operational": 2, 00:11:29.499 "base_bdevs_list": [ 00:11:29.499 { 00:11:29.499 "name": "BaseBdev1", 00:11:29.499 "uuid": "6a571d5b-66e3-4817-9b8a-c8711fceb026", 00:11:29.499 "is_configured": true, 00:11:29.499 "data_offset": 0, 00:11:29.499 "data_size": 65536 00:11:29.499 }, 00:11:29.499 { 00:11:29.499 "name": "BaseBdev2", 00:11:29.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.499 "is_configured": false, 00:11:29.499 "data_offset": 0, 00:11:29.499 "data_size": 0 00:11:29.499 } 00:11:29.499 ] 00:11:29.499 }' 00:11:29.499 19:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.499 19:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.435 19:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:30.695 [2024-07-24 19:47:22.090599] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:30.695 [2024-07-24 19:47:22.090634] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io 
device register 0x19720d0 00:11:30.695 [2024-07-24 19:47:22.090643] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:30.695 [2024-07-24 19:47:22.090830] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b15ab0 00:11:30.695 [2024-07-24 19:47:22.090953] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19720d0 00:11:30.695 [2024-07-24 19:47:22.090963] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19720d0 00:11:30.695 [2024-07-24 19:47:22.091129] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:30.695 BaseBdev2 00:11:30.695 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:30.695 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:11:30.695 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:30.695 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:30.695 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:30.695 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:30.695 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:30.954 [ 00:11:30.954 { 00:11:30.954 "name": "BaseBdev2", 00:11:30.954 "aliases": [ 00:11:30.954 "79d3e679-f7ea-4acd-8fa6-390c894e89d3" 00:11:30.954 ], 00:11:30.954 "product_name": "Malloc disk", 00:11:30.954 
"block_size": 512, 00:11:30.954 "num_blocks": 65536, 00:11:30.954 "uuid": "79d3e679-f7ea-4acd-8fa6-390c894e89d3", 00:11:30.954 "assigned_rate_limits": { 00:11:30.954 "rw_ios_per_sec": 0, 00:11:30.954 "rw_mbytes_per_sec": 0, 00:11:30.954 "r_mbytes_per_sec": 0, 00:11:30.954 "w_mbytes_per_sec": 0 00:11:30.954 }, 00:11:30.954 "claimed": true, 00:11:30.954 "claim_type": "exclusive_write", 00:11:30.954 "zoned": false, 00:11:30.954 "supported_io_types": { 00:11:30.954 "read": true, 00:11:30.954 "write": true, 00:11:30.954 "unmap": true, 00:11:30.954 "flush": true, 00:11:30.954 "reset": true, 00:11:30.954 "nvme_admin": false, 00:11:30.954 "nvme_io": false, 00:11:30.954 "nvme_io_md": false, 00:11:30.954 "write_zeroes": true, 00:11:30.954 "zcopy": true, 00:11:30.954 "get_zone_info": false, 00:11:30.954 "zone_management": false, 00:11:30.954 "zone_append": false, 00:11:30.954 "compare": false, 00:11:30.954 "compare_and_write": false, 00:11:30.954 "abort": true, 00:11:30.954 "seek_hole": false, 00:11:30.954 "seek_data": false, 00:11:30.954 "copy": true, 00:11:30.954 "nvme_iov_md": false 00:11:30.954 }, 00:11:30.954 "memory_domains": [ 00:11:30.954 { 00:11:30.954 "dma_device_id": "system", 00:11:30.954 "dma_device_type": 1 00:11:30.954 }, 00:11:30.954 { 00:11:30.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.954 "dma_device_type": 2 00:11:30.954 } 00:11:30.954 ], 00:11:30.954 "driver_specific": {} 00:11:30.954 } 00:11:30.954 ] 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.954 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:31.214 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.214 "name": "Existed_Raid", 00:11:31.214 "uuid": "8e5b5844-621c-40db-b22b-ca03d0d3582b", 00:11:31.214 "strip_size_kb": 64, 00:11:31.214 "state": "online", 00:11:31.214 "raid_level": "raid0", 00:11:31.214 "superblock": false, 00:11:31.214 "num_base_bdevs": 2, 00:11:31.214 "num_base_bdevs_discovered": 2, 00:11:31.214 "num_base_bdevs_operational": 2, 00:11:31.214 "base_bdevs_list": [ 00:11:31.214 { 00:11:31.214 "name": "BaseBdev1", 00:11:31.214 "uuid": "6a571d5b-66e3-4817-9b8a-c8711fceb026", 00:11:31.214 "is_configured": true, 00:11:31.214 "data_offset": 0, 00:11:31.214 "data_size": 65536 00:11:31.214 }, 00:11:31.214 { 00:11:31.214 
"name": "BaseBdev2", 00:11:31.214 "uuid": "79d3e679-f7ea-4acd-8fa6-390c894e89d3", 00:11:31.214 "is_configured": true, 00:11:31.214 "data_offset": 0, 00:11:31.214 "data_size": 65536 00:11:31.214 } 00:11:31.214 ] 00:11:31.214 }' 00:11:31.214 19:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.214 19:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.782 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:31.782 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:31.782 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:31.782 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:31.782 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:31.782 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:31.783 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:31.783 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:32.041 [2024-07-24 19:47:23.426419] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:32.041 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:32.041 "name": "Existed_Raid", 00:11:32.041 "aliases": [ 00:11:32.041 "8e5b5844-621c-40db-b22b-ca03d0d3582b" 00:11:32.041 ], 00:11:32.041 "product_name": "Raid Volume", 00:11:32.041 "block_size": 512, 00:11:32.041 "num_blocks": 131072, 00:11:32.041 "uuid": "8e5b5844-621c-40db-b22b-ca03d0d3582b", 00:11:32.041 "assigned_rate_limits": { 00:11:32.041 
"rw_ios_per_sec": 0, 00:11:32.041 "rw_mbytes_per_sec": 0, 00:11:32.041 "r_mbytes_per_sec": 0, 00:11:32.041 "w_mbytes_per_sec": 0 00:11:32.041 }, 00:11:32.041 "claimed": false, 00:11:32.041 "zoned": false, 00:11:32.041 "supported_io_types": { 00:11:32.041 "read": true, 00:11:32.041 "write": true, 00:11:32.041 "unmap": true, 00:11:32.041 "flush": true, 00:11:32.041 "reset": true, 00:11:32.041 "nvme_admin": false, 00:11:32.041 "nvme_io": false, 00:11:32.041 "nvme_io_md": false, 00:11:32.041 "write_zeroes": true, 00:11:32.041 "zcopy": false, 00:11:32.041 "get_zone_info": false, 00:11:32.041 "zone_management": false, 00:11:32.041 "zone_append": false, 00:11:32.041 "compare": false, 00:11:32.041 "compare_and_write": false, 00:11:32.041 "abort": false, 00:11:32.041 "seek_hole": false, 00:11:32.042 "seek_data": false, 00:11:32.042 "copy": false, 00:11:32.042 "nvme_iov_md": false 00:11:32.042 }, 00:11:32.042 "memory_domains": [ 00:11:32.042 { 00:11:32.042 "dma_device_id": "system", 00:11:32.042 "dma_device_type": 1 00:11:32.042 }, 00:11:32.042 { 00:11:32.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.042 "dma_device_type": 2 00:11:32.042 }, 00:11:32.042 { 00:11:32.042 "dma_device_id": "system", 00:11:32.042 "dma_device_type": 1 00:11:32.042 }, 00:11:32.042 { 00:11:32.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.042 "dma_device_type": 2 00:11:32.042 } 00:11:32.042 ], 00:11:32.042 "driver_specific": { 00:11:32.042 "raid": { 00:11:32.042 "uuid": "8e5b5844-621c-40db-b22b-ca03d0d3582b", 00:11:32.042 "strip_size_kb": 64, 00:11:32.042 "state": "online", 00:11:32.042 "raid_level": "raid0", 00:11:32.042 "superblock": false, 00:11:32.042 "num_base_bdevs": 2, 00:11:32.042 "num_base_bdevs_discovered": 2, 00:11:32.042 "num_base_bdevs_operational": 2, 00:11:32.042 "base_bdevs_list": [ 00:11:32.042 { 00:11:32.042 "name": "BaseBdev1", 00:11:32.042 "uuid": "6a571d5b-66e3-4817-9b8a-c8711fceb026", 00:11:32.042 "is_configured": true, 00:11:32.042 "data_offset": 0, 
00:11:32.042 "data_size": 65536 00:11:32.042 }, 00:11:32.042 { 00:11:32.042 "name": "BaseBdev2", 00:11:32.042 "uuid": "79d3e679-f7ea-4acd-8fa6-390c894e89d3", 00:11:32.042 "is_configured": true, 00:11:32.042 "data_offset": 0, 00:11:32.042 "data_size": 65536 00:11:32.042 } 00:11:32.042 ] 00:11:32.042 } 00:11:32.042 } 00:11:32.042 }' 00:11:32.042 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:32.042 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:32.042 BaseBdev2' 00:11:32.042 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:32.042 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:32.042 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:32.301 "name": "BaseBdev1", 00:11:32.301 "aliases": [ 00:11:32.301 "6a571d5b-66e3-4817-9b8a-c8711fceb026" 00:11:32.301 ], 00:11:32.301 "product_name": "Malloc disk", 00:11:32.301 "block_size": 512, 00:11:32.301 "num_blocks": 65536, 00:11:32.301 "uuid": "6a571d5b-66e3-4817-9b8a-c8711fceb026", 00:11:32.301 "assigned_rate_limits": { 00:11:32.301 "rw_ios_per_sec": 0, 00:11:32.301 "rw_mbytes_per_sec": 0, 00:11:32.301 "r_mbytes_per_sec": 0, 00:11:32.301 "w_mbytes_per_sec": 0 00:11:32.301 }, 00:11:32.301 "claimed": true, 00:11:32.301 "claim_type": "exclusive_write", 00:11:32.301 "zoned": false, 00:11:32.301 "supported_io_types": { 00:11:32.301 "read": true, 00:11:32.301 "write": true, 00:11:32.301 "unmap": true, 00:11:32.301 "flush": true, 00:11:32.301 "reset": true, 00:11:32.301 "nvme_admin": false, 00:11:32.301 
"nvme_io": false, 00:11:32.301 "nvme_io_md": false, 00:11:32.301 "write_zeroes": true, 00:11:32.301 "zcopy": true, 00:11:32.301 "get_zone_info": false, 00:11:32.301 "zone_management": false, 00:11:32.301 "zone_append": false, 00:11:32.301 "compare": false, 00:11:32.301 "compare_and_write": false, 00:11:32.301 "abort": true, 00:11:32.301 "seek_hole": false, 00:11:32.301 "seek_data": false, 00:11:32.301 "copy": true, 00:11:32.301 "nvme_iov_md": false 00:11:32.301 }, 00:11:32.301 "memory_domains": [ 00:11:32.301 { 00:11:32.301 "dma_device_id": "system", 00:11:32.301 "dma_device_type": 1 00:11:32.301 }, 00:11:32.301 { 00:11:32.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.301 "dma_device_type": 2 00:11:32.301 } 00:11:32.301 ], 00:11:32.301 "driver_specific": {} 00:11:32.301 }' 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:32.301 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.561 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.561 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:32.561 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.561 19:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.561 19:47:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:32.561 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:32.561 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:32.561 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:32.820 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:32.820 "name": "BaseBdev2", 00:11:32.820 "aliases": [ 00:11:32.820 "79d3e679-f7ea-4acd-8fa6-390c894e89d3" 00:11:32.820 ], 00:11:32.820 "product_name": "Malloc disk", 00:11:32.820 "block_size": 512, 00:11:32.820 "num_blocks": 65536, 00:11:32.820 "uuid": "79d3e679-f7ea-4acd-8fa6-390c894e89d3", 00:11:32.820 "assigned_rate_limits": { 00:11:32.820 "rw_ios_per_sec": 0, 00:11:32.820 "rw_mbytes_per_sec": 0, 00:11:32.820 "r_mbytes_per_sec": 0, 00:11:32.820 "w_mbytes_per_sec": 0 00:11:32.820 }, 00:11:32.820 "claimed": true, 00:11:32.820 "claim_type": "exclusive_write", 00:11:32.820 "zoned": false, 00:11:32.820 "supported_io_types": { 00:11:32.820 "read": true, 00:11:32.820 "write": true, 00:11:32.820 "unmap": true, 00:11:32.820 "flush": true, 00:11:32.820 "reset": true, 00:11:32.820 "nvme_admin": false, 00:11:32.820 "nvme_io": false, 00:11:32.820 "nvme_io_md": false, 00:11:32.820 "write_zeroes": true, 00:11:32.820 "zcopy": true, 00:11:32.820 "get_zone_info": false, 00:11:32.820 "zone_management": false, 00:11:32.820 "zone_append": false, 00:11:32.820 "compare": false, 00:11:32.820 "compare_and_write": false, 00:11:32.820 "abort": true, 00:11:32.820 "seek_hole": false, 00:11:32.820 "seek_data": false, 00:11:32.820 "copy": true, 00:11:32.820 "nvme_iov_md": false 00:11:32.820 }, 00:11:32.820 "memory_domains": [ 00:11:32.820 { 00:11:32.820 "dma_device_id": "system", 00:11:32.820 "dma_device_type": 1 00:11:32.820 }, 
00:11:32.820 {
00:11:32.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:32.820 "dma_device_type": 2
00:11:32.820 }
00:11:32.820 ],
00:11:32.820 "driver_specific": {}
00:11:32.820 }'
00:11:32.820 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:11:32.820 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:11:32.820 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:11:32.820 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:11:32.820 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:11:33.080 19:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:11:33.648 [2024-07-24 19:47:25.114673] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:11:33.649 [2024-07-24 19:47:25.114700] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:11:33.649 [2024-07-24 19:47:25.114740] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:33.649 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:33.908 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:33.908 "name": "Existed_Raid",
00:11:33.908 "uuid": "8e5b5844-621c-40db-b22b-ca03d0d3582b",
00:11:33.908 "strip_size_kb": 64,
00:11:33.908 "state": "offline",
00:11:33.908 "raid_level": "raid0",
00:11:33.908 "superblock": false,
00:11:33.908 "num_base_bdevs": 2,
00:11:33.908 "num_base_bdevs_discovered": 1,
00:11:33.908 "num_base_bdevs_operational": 1,
00:11:33.908 "base_bdevs_list": [
00:11:33.908 {
00:11:33.908 "name": null,
00:11:33.908 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:33.908 "is_configured": false,
00:11:33.908 "data_offset": 0,
00:11:33.908 "data_size": 65536
00:11:33.908 },
00:11:33.908 {
00:11:33.908 "name": "BaseBdev2",
00:11:33.908 "uuid": "79d3e679-f7ea-4acd-8fa6-390c894e89d3",
00:11:33.908 "is_configured": true,
00:11:33.908 "data_offset": 0,
00:11:33.908 "data_size": 65536
00:11:33.908 }
00:11:33.908 ]
00:11:33.908 }'
00:11:33.908 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:33.908 19:47:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:34.477 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:11:34.477 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:11:34.477 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:11:34.477 19:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:34.736 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:11:34.736 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:11:34.736 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:11:35.305 [2024-07-24 19:47:26.656745] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:11:35.305 [2024-07-24 19:47:26.656794] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19720d0 name Existed_Raid, state offline
00:11:35.305 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:11:35.305 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:11:35.305 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:35.305 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']'
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1372889
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1372889 ']'
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1372889
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1372889
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1372889'
killing process with pid 1372889
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1372889
00:11:35.565 [2024-07-24 19:47:26.996064] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:11:35.565 19:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1372889
00:11:35.565 [2024-07-24 19:47:26.997027] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0
00:11:35.824
00:11:35.824 real 0m11.628s
00:11:35.824 user 0m20.639s
00:11:35.824 sys 0m2.174s
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:35.824 ************************************
00:11:35.824 END TEST raid_state_function_test
00:11:35.824 ************************************
00:11:35.824 19:47:27 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true
00:11:35.824 19:47:27 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:11:35.824 19:47:27 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable
00:11:35.824 19:47:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:11:35.824 ************************************
00:11:35.824 START TEST raid_state_function_test_sb
00:11:35.824 ************************************
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 true
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']'
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']'
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1374616
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1374616'
Process raid pid: 1374616
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1374616 /var/tmp/spdk-raid.sock
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1374616 ']'
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable
00:11:35.824 19:47:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:11:35.824 [2024-07-24 19:47:27.381640] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:11:35.824 [2024-07-24 19:47:27.381708] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:11:36.083 [2024-07-24 19:47:27.510584] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:36.083 [2024-07-24 19:47:27.621172] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:36.342 [2024-07-24 19:47:27.689957] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:36.342 [2024-07-24 19:47:27.689995] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:36.910 19:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:11:36.910 19:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0
00:11:36.910 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:11:37.168 [2024-07-24 19:47:28.525453] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:11:37.168 [2024-07-24 19:47:28.525491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:11:37.168 [2024-07-24 19:47:28.525502] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:11:37.168 [2024-07-24 19:47:28.525514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:37.168 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:37.427 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:37.427 "name": "Existed_Raid",
00:11:37.427 "uuid": "736c5674-8285-457b-b273-2fe8cbf7f95d",
00:11:37.427 "strip_size_kb": 64,
00:11:37.427 "state": "configuring",
00:11:37.427 "raid_level": "raid0",
00:11:37.427 "superblock": true,
00:11:37.427 "num_base_bdevs": 2,
00:11:37.427 "num_base_bdevs_discovered": 0,
00:11:37.427 "num_base_bdevs_operational": 2,
00:11:37.427 "base_bdevs_list": [
00:11:37.427 {
00:11:37.427 "name": "BaseBdev1",
00:11:37.427 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:37.427 "is_configured": false,
00:11:37.427 "data_offset": 0,
00:11:37.427 "data_size": 0
00:11:37.427 },
00:11:37.427 {
00:11:37.427 "name": "BaseBdev2",
00:11:37.427 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:37.427 "is_configured": false,
00:11:37.427 "data_offset": 0,
00:11:37.427 "data_size": 0
00:11:37.427 }
00:11:37.427 ]
00:11:37.427 }'
00:11:37.427 19:47:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:37.427 19:47:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:11:37.993 19:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:11:38.251 [2024-07-24 19:47:29.620342] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:11:38.251 [2024-07-24 19:47:29.620369] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12fd9f0 name Existed_Raid, state configuring
00:11:38.251 19:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:11:38.510 [2024-07-24 19:47:29.869026] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:11:38.510 [2024-07-24 19:47:29.869052] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:11:38.510 [2024-07-24 19:47:29.869062] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:11:38.510 [2024-07-24 19:47:29.869073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:11:38.510 19:47:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:11:38.770 [2024-07-24 19:47:30.139685] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
BaseBdev1
00:11:38.770 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:11:38.770 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1
00:11:38.770 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:11:38.770 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i
00:11:38.770 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:11:38.770 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:11:38.770 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:11:39.028 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:11:39.287 [
00:11:39.287 {
00:11:39.287 "name": "BaseBdev1",
00:11:39.287 "aliases": [
00:11:39.287 "dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842"
00:11:39.287 ],
00:11:39.287 "product_name": "Malloc disk",
00:11:39.287 "block_size": 512,
00:11:39.288 "num_blocks": 65536,
00:11:39.288 "uuid": "dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842",
00:11:39.288 "assigned_rate_limits": {
00:11:39.288 "rw_ios_per_sec": 0,
00:11:39.288 "rw_mbytes_per_sec": 0,
00:11:39.288 "r_mbytes_per_sec": 0,
00:11:39.288 "w_mbytes_per_sec": 0
00:11:39.288 },
00:11:39.288 "claimed": true,
00:11:39.288 "claim_type": "exclusive_write",
00:11:39.288 "zoned": false,
00:11:39.288 "supported_io_types": {
00:11:39.288 "read": true,
00:11:39.288 "write": true,
00:11:39.288 "unmap": true,
00:11:39.288 "flush": true,
00:11:39.288 "reset": true,
00:11:39.288 "nvme_admin": false,
00:11:39.288 "nvme_io": false,
00:11:39.288 "nvme_io_md": false,
00:11:39.288 "write_zeroes": true,
00:11:39.288 "zcopy": true,
00:11:39.288 "get_zone_info": false,
00:11:39.288 "zone_management": false,
00:11:39.288 "zone_append": false,
00:11:39.288 "compare": false,
00:11:39.288 "compare_and_write": false,
00:11:39.288 "abort": true,
00:11:39.288 "seek_hole": false,
00:11:39.288 "seek_data": false,
00:11:39.288 "copy": true,
00:11:39.288 "nvme_iov_md": false
00:11:39.288 },
00:11:39.288 "memory_domains": [
00:11:39.288 {
00:11:39.288 "dma_device_id": "system",
00:11:39.288 "dma_device_type": 1
00:11:39.288 },
00:11:39.288 {
00:11:39.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:39.288 "dma_device_type": 2
00:11:39.288 }
00:11:39.288 ],
00:11:39.288 "driver_specific": {}
00:11:39.288 }
00:11:39.288 ]
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:39.288 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:39.581 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:39.581 "name": "Existed_Raid",
00:11:39.581 "uuid": "1342a7b8-87b2-4c27-a33b-a31ac193ce10",
00:11:39.581 "strip_size_kb": 64,
00:11:39.581 "state": "configuring",
00:11:39.581 "raid_level": "raid0",
00:11:39.581 "superblock": true,
00:11:39.581 "num_base_bdevs": 2,
00:11:39.581 "num_base_bdevs_discovered": 1,
00:11:39.581 "num_base_bdevs_operational": 2,
00:11:39.581 "base_bdevs_list": [
00:11:39.581 {
00:11:39.581 "name": "BaseBdev1",
00:11:39.581 "uuid": "dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842",
00:11:39.581 "is_configured": true,
00:11:39.581 "data_offset": 2048,
00:11:39.581 "data_size": 63488
00:11:39.581 },
00:11:39.581 {
00:11:39.581 "name": "BaseBdev2",
00:11:39.581 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:39.581 "is_configured": false,
00:11:39.581 "data_offset": 0,
00:11:39.581 "data_size": 0
00:11:39.581 }
00:11:39.581 ]
00:11:39.581 }'
00:11:39.581 19:47:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:39.581 19:47:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:11:40.204 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:11:40.204 [2024-07-24 19:47:31.703842] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:11:40.204 [2024-07-24 19:47:31.703884] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12fd2e0 name Existed_Raid, state configuring
00:11:40.204 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:11:40.463 [2024-07-24 19:47:31.948526] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:11:40.463 [2024-07-24 19:47:31.950012] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:11:40.463 [2024-07-24 19:47:31.950044] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:40.463 19:47:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:40.722 19:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:40.722 "name": "Existed_Raid",
00:11:40.722 "uuid": "35f45880-a0f6-4b23-ae8c-4ea4bcc11fd0",
00:11:40.722 "strip_size_kb": 64,
00:11:40.722 "state": "configuring",
00:11:40.722 "raid_level": "raid0",
00:11:40.722 "superblock": true,
00:11:40.722 "num_base_bdevs": 2,
00:11:40.722 "num_base_bdevs_discovered": 1,
00:11:40.722 "num_base_bdevs_operational": 2,
00:11:40.722 "base_bdevs_list": [
00:11:40.722 {
00:11:40.722 "name": "BaseBdev1",
00:11:40.722 "uuid": "dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842",
00:11:40.722 "is_configured": true,
00:11:40.722 "data_offset": 2048,
00:11:40.722 "data_size": 63488
00:11:40.722 },
00:11:40.722 {
00:11:40.722 "name": "BaseBdev2",
00:11:40.722 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:40.722 "is_configured": false,
00:11:40.722 "data_offset": 0,
00:11:40.722 "data_size": 0
00:11:40.722 }
00:11:40.722 ]
00:11:40.722 }'
00:11:40.722 19:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:40.722 19:47:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:11:41.289 19:47:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:11:41.549 [2024-07-24 19:47:33.074797] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:11:41.549 [2024-07-24 19:47:33.074942] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12fe0d0
00:11:41.549 [2024-07-24 19:47:33.074955] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512
00:11:41.549 [2024-07-24 19:47:33.075126] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14b1a50
00:11:41.549 [2024-07-24 19:47:33.075250] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12fe0d0
00:11:41.550 [2024-07-24 19:47:33.075260] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12fe0d0
00:11:41.550 [2024-07-24 19:47:33.075350] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
BaseBdev2
00:11:41.550 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:11:41.550 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2
00:11:41.550 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:11:41.550 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i
00:11:41.550 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:11:41.550 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:11:41.550 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:11:41.809 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:11:42.067 [
00:11:42.067 {
00:11:42.067 "name": "BaseBdev2",
00:11:42.067 "aliases": [
00:11:42.067 "710ccb0d-e352-40c0-9110-dc16f519a199"
00:11:42.067 ],
00:11:42.067 "product_name": "Malloc disk",
00:11:42.067 "block_size": 512,
00:11:42.067 "num_blocks": 65536,
00:11:42.067 "uuid": "710ccb0d-e352-40c0-9110-dc16f519a199",
00:11:42.067 "assigned_rate_limits": {
00:11:42.067 "rw_ios_per_sec": 0,
00:11:42.067 "rw_mbytes_per_sec": 0,
00:11:42.067 "r_mbytes_per_sec": 0,
00:11:42.067 "w_mbytes_per_sec": 0
00:11:42.067 },
00:11:42.067 "claimed": true,
00:11:42.067 "claim_type": "exclusive_write",
00:11:42.067 "zoned": false,
00:11:42.067 "supported_io_types": {
00:11:42.067 "read": true,
00:11:42.067 "write": true,
00:11:42.067 "unmap": true,
00:11:42.067 "flush": true,
00:11:42.067 "reset": true,
00:11:42.067 "nvme_admin": false,
00:11:42.067 "nvme_io": false,
00:11:42.067 "nvme_io_md": false,
00:11:42.067 "write_zeroes": true,
00:11:42.067 "zcopy": true,
00:11:42.067 "get_zone_info": false,
00:11:42.067 "zone_management": false,
00:11:42.067 "zone_append": false,
00:11:42.067 "compare": false,
00:11:42.067 "compare_and_write": false,
00:11:42.067 "abort": true,
00:11:42.067 "seek_hole": false,
00:11:42.067 "seek_data": false,
00:11:42.067 "copy": true,
00:11:42.067 "nvme_iov_md": false
00:11:42.067 },
00:11:42.067 "memory_domains": [
00:11:42.067 {
00:11:42.067 "dma_device_id": "system",
00:11:42.067 "dma_device_type": 1
00:11:42.067 },
00:11:42.067 {
00:11:42.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:42.067 "dma_device_type": 2
00:11:42.067 } 00:11:42.067 ], 00:11:42.067 "driver_specific": {} 00:11:42.067 } 00:11:42.067 ] 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.067 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.326 19:47:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.326 "name": "Existed_Raid", 00:11:42.326 "uuid": "35f45880-a0f6-4b23-ae8c-4ea4bcc11fd0", 00:11:42.326 "strip_size_kb": 64, 00:11:42.326 "state": "online", 00:11:42.326 "raid_level": "raid0", 00:11:42.326 "superblock": true, 00:11:42.326 "num_base_bdevs": 2, 00:11:42.326 "num_base_bdevs_discovered": 2, 00:11:42.326 "num_base_bdevs_operational": 2, 00:11:42.326 "base_bdevs_list": [ 00:11:42.326 { 00:11:42.326 "name": "BaseBdev1", 00:11:42.326 "uuid": "dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842", 00:11:42.326 "is_configured": true, 00:11:42.326 "data_offset": 2048, 00:11:42.326 "data_size": 63488 00:11:42.326 }, 00:11:42.326 { 00:11:42.326 "name": "BaseBdev2", 00:11:42.326 "uuid": "710ccb0d-e352-40c0-9110-dc16f519a199", 00:11:42.326 "is_configured": true, 00:11:42.326 "data_offset": 2048, 00:11:42.326 "data_size": 63488 00:11:42.326 } 00:11:42.326 ] 00:11:42.326 }' 00:11:42.326 19:47:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.326 19:47:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:42.894 19:47:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:43.154 [2024-07-24 19:47:34.659308] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:43.154 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:43.154 "name": "Existed_Raid", 00:11:43.154 "aliases": [ 00:11:43.154 "35f45880-a0f6-4b23-ae8c-4ea4bcc11fd0" 00:11:43.154 ], 00:11:43.154 "product_name": "Raid Volume", 00:11:43.154 "block_size": 512, 00:11:43.154 "num_blocks": 126976, 00:11:43.154 "uuid": "35f45880-a0f6-4b23-ae8c-4ea4bcc11fd0", 00:11:43.154 "assigned_rate_limits": { 00:11:43.154 "rw_ios_per_sec": 0, 00:11:43.154 "rw_mbytes_per_sec": 0, 00:11:43.154 "r_mbytes_per_sec": 0, 00:11:43.154 "w_mbytes_per_sec": 0 00:11:43.154 }, 00:11:43.154 "claimed": false, 00:11:43.154 "zoned": false, 00:11:43.154 "supported_io_types": { 00:11:43.154 "read": true, 00:11:43.154 "write": true, 00:11:43.154 "unmap": true, 00:11:43.154 "flush": true, 00:11:43.154 "reset": true, 00:11:43.154 "nvme_admin": false, 00:11:43.154 "nvme_io": false, 00:11:43.154 "nvme_io_md": false, 00:11:43.154 "write_zeroes": true, 00:11:43.154 "zcopy": false, 00:11:43.154 "get_zone_info": false, 00:11:43.154 "zone_management": false, 00:11:43.154 "zone_append": false, 00:11:43.154 "compare": false, 00:11:43.154 "compare_and_write": false, 00:11:43.154 "abort": false, 00:11:43.154 "seek_hole": false, 00:11:43.154 "seek_data": false, 00:11:43.154 "copy": false, 00:11:43.154 "nvme_iov_md": false 00:11:43.154 }, 00:11:43.154 "memory_domains": [ 00:11:43.154 { 00:11:43.154 "dma_device_id": "system", 00:11:43.154 "dma_device_type": 1 00:11:43.154 }, 00:11:43.154 { 00:11:43.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.154 "dma_device_type": 2 00:11:43.154 }, 00:11:43.154 { 00:11:43.154 "dma_device_id": "system", 00:11:43.154 "dma_device_type": 1 00:11:43.154 }, 
00:11:43.154 { 00:11:43.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.154 "dma_device_type": 2 00:11:43.154 } 00:11:43.154 ], 00:11:43.154 "driver_specific": { 00:11:43.154 "raid": { 00:11:43.154 "uuid": "35f45880-a0f6-4b23-ae8c-4ea4bcc11fd0", 00:11:43.154 "strip_size_kb": 64, 00:11:43.154 "state": "online", 00:11:43.154 "raid_level": "raid0", 00:11:43.154 "superblock": true, 00:11:43.154 "num_base_bdevs": 2, 00:11:43.154 "num_base_bdevs_discovered": 2, 00:11:43.154 "num_base_bdevs_operational": 2, 00:11:43.154 "base_bdevs_list": [ 00:11:43.154 { 00:11:43.154 "name": "BaseBdev1", 00:11:43.154 "uuid": "dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842", 00:11:43.154 "is_configured": true, 00:11:43.154 "data_offset": 2048, 00:11:43.154 "data_size": 63488 00:11:43.154 }, 00:11:43.154 { 00:11:43.154 "name": "BaseBdev2", 00:11:43.154 "uuid": "710ccb0d-e352-40c0-9110-dc16f519a199", 00:11:43.154 "is_configured": true, 00:11:43.154 "data_offset": 2048, 00:11:43.154 "data_size": 63488 00:11:43.154 } 00:11:43.154 ] 00:11:43.154 } 00:11:43.154 } 00:11:43.154 }' 00:11:43.154 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:43.154 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:43.154 BaseBdev2' 00:11:43.154 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:43.154 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:43.154 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:43.413 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:43.413 "name": "BaseBdev1", 00:11:43.413 "aliases": [ 00:11:43.413 
"dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842" 00:11:43.413 ], 00:11:43.413 "product_name": "Malloc disk", 00:11:43.413 "block_size": 512, 00:11:43.413 "num_blocks": 65536, 00:11:43.413 "uuid": "dbbd7bcb-c29c-48da-b7b3-d9f0d4a0a842", 00:11:43.413 "assigned_rate_limits": { 00:11:43.413 "rw_ios_per_sec": 0, 00:11:43.413 "rw_mbytes_per_sec": 0, 00:11:43.413 "r_mbytes_per_sec": 0, 00:11:43.413 "w_mbytes_per_sec": 0 00:11:43.413 }, 00:11:43.413 "claimed": true, 00:11:43.413 "claim_type": "exclusive_write", 00:11:43.413 "zoned": false, 00:11:43.413 "supported_io_types": { 00:11:43.413 "read": true, 00:11:43.413 "write": true, 00:11:43.413 "unmap": true, 00:11:43.413 "flush": true, 00:11:43.414 "reset": true, 00:11:43.414 "nvme_admin": false, 00:11:43.414 "nvme_io": false, 00:11:43.414 "nvme_io_md": false, 00:11:43.414 "write_zeroes": true, 00:11:43.414 "zcopy": true, 00:11:43.414 "get_zone_info": false, 00:11:43.414 "zone_management": false, 00:11:43.414 "zone_append": false, 00:11:43.414 "compare": false, 00:11:43.414 "compare_and_write": false, 00:11:43.414 "abort": true, 00:11:43.414 "seek_hole": false, 00:11:43.414 "seek_data": false, 00:11:43.414 "copy": true, 00:11:43.414 "nvme_iov_md": false 00:11:43.414 }, 00:11:43.414 "memory_domains": [ 00:11:43.414 { 00:11:43.414 "dma_device_id": "system", 00:11:43.414 "dma_device_type": 1 00:11:43.414 }, 00:11:43.414 { 00:11:43.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.414 "dma_device_type": 2 00:11:43.414 } 00:11:43.414 ], 00:11:43.414 "driver_specific": {} 00:11:43.414 }' 00:11:43.414 19:47:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.673 19:47:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.673 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.933 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.933 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:43.933 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:43.933 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:43.933 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:44.193 "name": "BaseBdev2", 00:11:44.193 "aliases": [ 00:11:44.193 "710ccb0d-e352-40c0-9110-dc16f519a199" 00:11:44.193 ], 00:11:44.193 "product_name": "Malloc disk", 00:11:44.193 "block_size": 512, 00:11:44.193 "num_blocks": 65536, 00:11:44.193 "uuid": "710ccb0d-e352-40c0-9110-dc16f519a199", 00:11:44.193 "assigned_rate_limits": { 00:11:44.193 "rw_ios_per_sec": 0, 00:11:44.193 "rw_mbytes_per_sec": 0, 00:11:44.193 "r_mbytes_per_sec": 0, 00:11:44.193 "w_mbytes_per_sec": 0 00:11:44.193 }, 00:11:44.193 "claimed": true, 00:11:44.193 "claim_type": "exclusive_write", 00:11:44.193 "zoned": false, 00:11:44.193 "supported_io_types": 
{ 00:11:44.193 "read": true, 00:11:44.193 "write": true, 00:11:44.193 "unmap": true, 00:11:44.193 "flush": true, 00:11:44.193 "reset": true, 00:11:44.193 "nvme_admin": false, 00:11:44.193 "nvme_io": false, 00:11:44.193 "nvme_io_md": false, 00:11:44.193 "write_zeroes": true, 00:11:44.193 "zcopy": true, 00:11:44.193 "get_zone_info": false, 00:11:44.193 "zone_management": false, 00:11:44.193 "zone_append": false, 00:11:44.193 "compare": false, 00:11:44.193 "compare_and_write": false, 00:11:44.193 "abort": true, 00:11:44.193 "seek_hole": false, 00:11:44.193 "seek_data": false, 00:11:44.193 "copy": true, 00:11:44.193 "nvme_iov_md": false 00:11:44.193 }, 00:11:44.193 "memory_domains": [ 00:11:44.193 { 00:11:44.193 "dma_device_id": "system", 00:11:44.193 "dma_device_type": 1 00:11:44.193 }, 00:11:44.193 { 00:11:44.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.193 "dma_device_type": 2 00:11:44.193 } 00:11:44.193 ], 00:11:44.193 "driver_specific": {} 00:11:44.193 }' 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.193 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.452 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:44.452 19:47:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.452 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.452 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:44.452 19:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:44.712 [2024-07-24 19:47:36.126990] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:44.712 [2024-07-24 19:47:36.127018] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:44.712 [2024-07-24 19:47:36.127057] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.712 19:47:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.712 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.971 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.971 "name": "Existed_Raid", 00:11:44.971 "uuid": "35f45880-a0f6-4b23-ae8c-4ea4bcc11fd0", 00:11:44.971 "strip_size_kb": 64, 00:11:44.971 "state": "offline", 00:11:44.971 "raid_level": "raid0", 00:11:44.971 "superblock": true, 00:11:44.971 "num_base_bdevs": 2, 00:11:44.971 "num_base_bdevs_discovered": 1, 00:11:44.971 "num_base_bdevs_operational": 1, 00:11:44.971 "base_bdevs_list": [ 00:11:44.971 { 00:11:44.971 "name": null, 00:11:44.971 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.971 "is_configured": false, 00:11:44.971 "data_offset": 2048, 00:11:44.971 "data_size": 63488 00:11:44.971 }, 00:11:44.971 { 00:11:44.971 "name": "BaseBdev2", 00:11:44.971 "uuid": "710ccb0d-e352-40c0-9110-dc16f519a199", 00:11:44.971 "is_configured": true, 00:11:44.971 "data_offset": 2048, 00:11:44.971 "data_size": 63488 00:11:44.971 } 00:11:44.971 ] 00:11:44.971 }' 00:11:44.971 19:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.971 19:47:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:45.540 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:45.540 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:45.540 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.540 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:45.800 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:45.800 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:45.800 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:46.059 [2024-07-24 19:47:37.464405] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:46.059 [2024-07-24 19:47:37.464453] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12fe0d0 name Existed_Raid, state offline 00:11:46.059 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:46.059 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:46.059 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.059 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:46.318 19:47:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1374616 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1374616 ']' 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1374616 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1374616 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1374616' 00:11:46.318 killing process with pid 1374616 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1374616 00:11:46.318 [2024-07-24 19:47:37.795750] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:46.318 19:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1374616 00:11:46.318 [2024-07-24 19:47:37.796667] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:46.578 19:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:46.578 00:11:46.578 real 0m10.691s 00:11:46.578 user 0m19.017s 00:11:46.578 sys 0m1.983s 00:11:46.578 19:47:38 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:46.578 19:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:46.578 ************************************ 00:11:46.578 END TEST raid_state_function_test_sb 00:11:46.578 ************************************ 00:11:46.578 19:47:38 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:46.578 19:47:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:11:46.578 19:47:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:46.578 19:47:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:46.578 ************************************ 00:11:46.578 START TEST raid_superblock_test 00:11:46.578 ************************************ 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 2 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 
00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1376200 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1376200 /var/tmp/spdk-raid.sock 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1376200 ']' 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:46.578 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:46.578 19:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.578 [2024-07-24 19:47:38.145375] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:11:46.578 [2024-07-24 19:47:38.145450] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1376200 ] 00:11:46.838 [2024-07-24 19:47:38.275042] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:46.838 [2024-07-24 19:47:38.380088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:47.098 [2024-07-24 19:47:38.444403] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:47.098 [2024-07-24 19:47:38.444437] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:47.668 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:47.926 malloc1 00:11:47.926 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:48.186 [2024-07-24 19:47:39.555006] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:48.186 [2024-07-24 19:47:39.555052] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:48.186 [2024-07-24 19:47:39.555074] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2216590 00:11:48.186 [2024-07-24 19:47:39.555087] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:48.186 [2024-07-24 19:47:39.556811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:48.186 [2024-07-24 19:47:39.556841] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:48.186 pt1 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:48.186 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:48.445 malloc2 00:11:48.445 19:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:48.704 [2024-07-24 19:47:40.057759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:48.704 [2024-07-24 19:47:40.057811] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:48.704 [2024-07-24 19:47:40.057828] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23bc690 00:11:48.704 [2024-07-24 19:47:40.057848] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:48.704 [2024-07-24 19:47:40.059437] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:48.704 [2024-07-24 19:47:40.059465] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:48.704 pt2 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:48.704 [2024-07-24 19:47:40.238248] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:48.704 [2024-07-24 19:47:40.239438] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:48.704 [2024-07-24 19:47:40.239581] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23bd980 00:11:48.704 [2024-07-24 19:47:40.239594] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:48.704 [2024-07-24 19:47:40.239781] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23be730 00:11:48.704 [2024-07-24 19:47:40.239922] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23bd980 00:11:48.704 [2024-07-24 19:47:40.239932] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23bd980 00:11:48.704 [2024-07-24 19:47:40.240025] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.704 19:47:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.704 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:48.964 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.964 "name": "raid_bdev1", 00:11:48.964 "uuid": "35697ca3-a924-4a2c-9947-061f0b841a14", 00:11:48.964 "strip_size_kb": 64, 00:11:48.964 "state": "online", 00:11:48.964 "raid_level": "raid0", 00:11:48.964 "superblock": true, 00:11:48.964 "num_base_bdevs": 2, 00:11:48.964 "num_base_bdevs_discovered": 2, 00:11:48.964 "num_base_bdevs_operational": 2, 00:11:48.964 "base_bdevs_list": [ 00:11:48.964 { 00:11:48.964 "name": "pt1", 00:11:48.964 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:48.964 "is_configured": true, 00:11:48.964 "data_offset": 2048, 00:11:48.964 "data_size": 63488 00:11:48.964 }, 00:11:48.964 { 00:11:48.964 "name": "pt2", 00:11:48.964 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:48.964 "is_configured": true, 00:11:48.964 "data_offset": 2048, 00:11:48.964 "data_size": 63488 00:11:48.964 } 00:11:48.964 ] 00:11:48.964 }' 00:11:48.964 19:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.964 19:47:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:49.533 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:49.792 [2024-07-24 19:47:41.273200] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:49.792 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:49.792 "name": "raid_bdev1", 00:11:49.792 "aliases": [ 00:11:49.792 "35697ca3-a924-4a2c-9947-061f0b841a14" 00:11:49.792 ], 00:11:49.792 "product_name": "Raid Volume", 00:11:49.792 "block_size": 512, 00:11:49.792 "num_blocks": 126976, 00:11:49.792 "uuid": "35697ca3-a924-4a2c-9947-061f0b841a14", 00:11:49.793 "assigned_rate_limits": { 00:11:49.793 "rw_ios_per_sec": 0, 00:11:49.793 "rw_mbytes_per_sec": 0, 00:11:49.793 "r_mbytes_per_sec": 0, 00:11:49.793 "w_mbytes_per_sec": 0 00:11:49.793 }, 00:11:49.793 "claimed": false, 00:11:49.793 "zoned": false, 00:11:49.793 "supported_io_types": { 00:11:49.793 "read": true, 00:11:49.793 "write": true, 00:11:49.793 "unmap": true, 00:11:49.793 "flush": true, 00:11:49.793 "reset": true, 00:11:49.793 "nvme_admin": false, 00:11:49.793 "nvme_io": false, 00:11:49.793 "nvme_io_md": false, 00:11:49.793 "write_zeroes": true, 00:11:49.793 "zcopy": false, 00:11:49.793 "get_zone_info": false, 00:11:49.793 "zone_management": false, 00:11:49.793 "zone_append": false, 00:11:49.793 "compare": false, 00:11:49.793 "compare_and_write": false, 00:11:49.793 "abort": false, 00:11:49.793 "seek_hole": false, 00:11:49.793 "seek_data": false, 00:11:49.793 "copy": false, 00:11:49.793 "nvme_iov_md": false 00:11:49.793 }, 00:11:49.793 "memory_domains": 
[ 00:11:49.793 { 00:11:49.793 "dma_device_id": "system", 00:11:49.793 "dma_device_type": 1 00:11:49.793 }, 00:11:49.793 { 00:11:49.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.793 "dma_device_type": 2 00:11:49.793 }, 00:11:49.793 { 00:11:49.793 "dma_device_id": "system", 00:11:49.793 "dma_device_type": 1 00:11:49.793 }, 00:11:49.793 { 00:11:49.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.793 "dma_device_type": 2 00:11:49.793 } 00:11:49.793 ], 00:11:49.793 "driver_specific": { 00:11:49.793 "raid": { 00:11:49.793 "uuid": "35697ca3-a924-4a2c-9947-061f0b841a14", 00:11:49.793 "strip_size_kb": 64, 00:11:49.793 "state": "online", 00:11:49.793 "raid_level": "raid0", 00:11:49.793 "superblock": true, 00:11:49.793 "num_base_bdevs": 2, 00:11:49.793 "num_base_bdevs_discovered": 2, 00:11:49.793 "num_base_bdevs_operational": 2, 00:11:49.793 "base_bdevs_list": [ 00:11:49.793 { 00:11:49.793 "name": "pt1", 00:11:49.793 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:49.793 "is_configured": true, 00:11:49.793 "data_offset": 2048, 00:11:49.793 "data_size": 63488 00:11:49.793 }, 00:11:49.793 { 00:11:49.793 "name": "pt2", 00:11:49.793 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:49.793 "is_configured": true, 00:11:49.793 "data_offset": 2048, 00:11:49.793 "data_size": 63488 00:11:49.793 } 00:11:49.793 ] 00:11:49.793 } 00:11:49.793 } 00:11:49.793 }' 00:11:49.793 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:49.793 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:49.793 pt2' 00:11:49.793 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:49.793 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:49.793 
19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:50.052 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:50.052 "name": "pt1", 00:11:50.052 "aliases": [ 00:11:50.052 "00000000-0000-0000-0000-000000000001" 00:11:50.052 ], 00:11:50.052 "product_name": "passthru", 00:11:50.052 "block_size": 512, 00:11:50.052 "num_blocks": 65536, 00:11:50.052 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:50.052 "assigned_rate_limits": { 00:11:50.052 "rw_ios_per_sec": 0, 00:11:50.052 "rw_mbytes_per_sec": 0, 00:11:50.052 "r_mbytes_per_sec": 0, 00:11:50.052 "w_mbytes_per_sec": 0 00:11:50.052 }, 00:11:50.052 "claimed": true, 00:11:50.052 "claim_type": "exclusive_write", 00:11:50.053 "zoned": false, 00:11:50.053 "supported_io_types": { 00:11:50.053 "read": true, 00:11:50.053 "write": true, 00:11:50.053 "unmap": true, 00:11:50.053 "flush": true, 00:11:50.053 "reset": true, 00:11:50.053 "nvme_admin": false, 00:11:50.053 "nvme_io": false, 00:11:50.053 "nvme_io_md": false, 00:11:50.053 "write_zeroes": true, 00:11:50.053 "zcopy": true, 00:11:50.053 "get_zone_info": false, 00:11:50.053 "zone_management": false, 00:11:50.053 "zone_append": false, 00:11:50.053 "compare": false, 00:11:50.053 "compare_and_write": false, 00:11:50.053 "abort": true, 00:11:50.053 "seek_hole": false, 00:11:50.053 "seek_data": false, 00:11:50.053 "copy": true, 00:11:50.053 "nvme_iov_md": false 00:11:50.053 }, 00:11:50.053 "memory_domains": [ 00:11:50.053 { 00:11:50.053 "dma_device_id": "system", 00:11:50.053 "dma_device_type": 1 00:11:50.053 }, 00:11:50.053 { 00:11:50.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.053 "dma_device_type": 2 00:11:50.053 } 00:11:50.053 ], 00:11:50.053 "driver_specific": { 00:11:50.053 "passthru": { 00:11:50.053 "name": "pt1", 00:11:50.053 "base_bdev_name": "malloc1" 00:11:50.053 } 00:11:50.053 } 00:11:50.053 }' 00:11:50.053 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:11:50.053 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:50.312 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:50.571 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:50.571 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:50.571 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:50.572 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:50.572 19:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:50.572 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:50.572 "name": "pt2", 00:11:50.572 "aliases": [ 00:11:50.572 "00000000-0000-0000-0000-000000000002" 00:11:50.572 ], 00:11:50.572 "product_name": "passthru", 00:11:50.572 "block_size": 512, 00:11:50.572 "num_blocks": 65536, 00:11:50.572 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:50.572 "assigned_rate_limits": { 00:11:50.572 "rw_ios_per_sec": 0, 00:11:50.572 "rw_mbytes_per_sec": 0, 
00:11:50.572 "r_mbytes_per_sec": 0, 00:11:50.572 "w_mbytes_per_sec": 0 00:11:50.572 }, 00:11:50.572 "claimed": true, 00:11:50.572 "claim_type": "exclusive_write", 00:11:50.572 "zoned": false, 00:11:50.572 "supported_io_types": { 00:11:50.572 "read": true, 00:11:50.572 "write": true, 00:11:50.572 "unmap": true, 00:11:50.572 "flush": true, 00:11:50.572 "reset": true, 00:11:50.572 "nvme_admin": false, 00:11:50.572 "nvme_io": false, 00:11:50.572 "nvme_io_md": false, 00:11:50.572 "write_zeroes": true, 00:11:50.572 "zcopy": true, 00:11:50.572 "get_zone_info": false, 00:11:50.572 "zone_management": false, 00:11:50.572 "zone_append": false, 00:11:50.572 "compare": false, 00:11:50.572 "compare_and_write": false, 00:11:50.572 "abort": true, 00:11:50.572 "seek_hole": false, 00:11:50.572 "seek_data": false, 00:11:50.572 "copy": true, 00:11:50.572 "nvme_iov_md": false 00:11:50.572 }, 00:11:50.572 "memory_domains": [ 00:11:50.572 { 00:11:50.572 "dma_device_id": "system", 00:11:50.572 "dma_device_type": 1 00:11:50.572 }, 00:11:50.572 { 00:11:50.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:50.572 "dma_device_type": 2 00:11:50.572 } 00:11:50.572 ], 00:11:50.572 "driver_specific": { 00:11:50.572 "passthru": { 00:11:50.572 "name": "pt2", 00:11:50.572 "base_bdev_name": "malloc2" 00:11:50.572 } 00:11:50.572 } 00:11:50.572 }' 00:11:50.572 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.830 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:50.830 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:50.831 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.831 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:50.831 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:50.831 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:11:50.831 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:51.089 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:51.089 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.089 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:51.089 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:51.089 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:51.089 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:11:51.348 [2024-07-24 19:47:42.753184] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:51.348 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=35697ca3-a924-4a2c-9947-061f0b841a14 00:11:51.348 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 35697ca3-a924-4a2c-9947-061f0b841a14 ']' 00:11:51.348 19:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:51.607 [2024-07-24 19:47:42.997579] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:51.607 [2024-07-24 19:47:42.997598] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:51.607 [2024-07-24 19:47:42.997650] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:51.607 [2024-07-24 19:47:42.997694] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:51.607 [2024-07-24 19:47:42.997705] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23bd980 name raid_bdev1, state offline 00:11:51.607 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.607 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:11:51.866 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:11:51.866 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:11:51.866 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:11:51.866 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:52.125 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:11:52.125 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:52.385 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:52.385 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:52.644 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:11:52.644 19:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:11:52.644 
19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:52.644 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:52.644 [2024-07-24 19:47:44.236813] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:52.903 [2024-07-24 19:47:44.238152] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:52.903 [2024-07-24 19:47:44.238208] 
bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:52.903 [2024-07-24 19:47:44.238249] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:52.903 [2024-07-24 19:47:44.238268] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:52.903 [2024-07-24 19:47:44.238278] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22167c0 name raid_bdev1, state configuring 00:11:52.903 request: 00:11:52.903 { 00:11:52.903 "name": "raid_bdev1", 00:11:52.903 "raid_level": "raid0", 00:11:52.903 "base_bdevs": [ 00:11:52.903 "malloc1", 00:11:52.903 "malloc2" 00:11:52.903 ], 00:11:52.903 "strip_size_kb": 64, 00:11:52.903 "superblock": false, 00:11:52.903 "method": "bdev_raid_create", 00:11:52.903 "req_id": 1 00:11:52.903 } 00:11:52.903 Got JSON-RPC error response 00:11:52.903 response: 00:11:52.903 { 00:11:52.903 "code": -17, 00:11:52.903 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:52.903 } 00:11:52.903 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:11:52.903 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:11:52.903 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:11:52.904 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:11:52.904 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.904 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:11:52.904 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:11:52.904 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' 
']' 00:11:52.904 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:53.163 [2024-07-24 19:47:44.597704] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:53.163 [2024-07-24 19:47:44.597736] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:53.163 [2024-07-24 19:47:44.597752] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23bc460 00:11:53.163 [2024-07-24 19:47:44.597764] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:53.163 [2024-07-24 19:47:44.599186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:53.163 [2024-07-24 19:47:44.599211] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:53.163 [2024-07-24 19:47:44.599265] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:53.163 [2024-07-24 19:47:44.599288] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:53.163 pt1 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:53.163 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.423 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.423 "name": "raid_bdev1", 00:11:53.423 "uuid": "35697ca3-a924-4a2c-9947-061f0b841a14", 00:11:53.423 "strip_size_kb": 64, 00:11:53.423 "state": "configuring", 00:11:53.423 "raid_level": "raid0", 00:11:53.423 "superblock": true, 00:11:53.423 "num_base_bdevs": 2, 00:11:53.423 "num_base_bdevs_discovered": 1, 00:11:53.423 "num_base_bdevs_operational": 2, 00:11:53.423 "base_bdevs_list": [ 00:11:53.423 { 00:11:53.423 "name": "pt1", 00:11:53.423 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:53.423 "is_configured": true, 00:11:53.423 "data_offset": 2048, 00:11:53.423 "data_size": 63488 00:11:53.423 }, 00:11:53.423 { 00:11:53.423 "name": null, 00:11:53.423 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:53.423 "is_configured": false, 00:11:53.423 "data_offset": 2048, 00:11:53.423 "data_size": 63488 00:11:53.423 } 00:11:53.423 ] 00:11:53.423 }' 00:11:53.423 19:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.423 19:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.991 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:11:53.991 19:47:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:11:53.991 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:11:53.991 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:54.251 [2024-07-24 19:47:45.676725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:54.251 [2024-07-24 19:47:45.676766] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:54.251 [2024-07-24 19:47:45.676784] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23bdd20 00:11:54.251 [2024-07-24 19:47:45.676796] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:54.251 [2024-07-24 19:47:45.677125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:54.251 [2024-07-24 19:47:45.677142] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:54.251 [2024-07-24 19:47:45.677198] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:54.251 [2024-07-24 19:47:45.677216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:54.251 [2024-07-24 19:47:45.677310] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2215570 00:11:54.251 [2024-07-24 19:47:45.677320] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:54.251 [2024-07-24 19:47:45.677492] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c1480 00:11:54.251 [2024-07-24 19:47:45.677616] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2215570 00:11:54.251 [2024-07-24 19:47:45.677626] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is 
created with name raid_bdev1, raid_bdev 0x2215570 00:11:54.251 [2024-07-24 19:47:45.677718] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:54.251 pt2 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.251 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:54.510 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.510 "name": "raid_bdev1", 00:11:54.510 "uuid": 
"35697ca3-a924-4a2c-9947-061f0b841a14", 00:11:54.510 "strip_size_kb": 64, 00:11:54.510 "state": "online", 00:11:54.510 "raid_level": "raid0", 00:11:54.510 "superblock": true, 00:11:54.510 "num_base_bdevs": 2, 00:11:54.510 "num_base_bdevs_discovered": 2, 00:11:54.510 "num_base_bdevs_operational": 2, 00:11:54.510 "base_bdevs_list": [ 00:11:54.510 { 00:11:54.510 "name": "pt1", 00:11:54.510 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:54.510 "is_configured": true, 00:11:54.510 "data_offset": 2048, 00:11:54.510 "data_size": 63488 00:11:54.510 }, 00:11:54.510 { 00:11:54.510 "name": "pt2", 00:11:54.510 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:54.510 "is_configured": true, 00:11:54.510 "data_offset": 2048, 00:11:54.510 "data_size": 63488 00:11:54.510 } 00:11:54.510 ] 00:11:54.510 }' 00:11:54.510 19:47:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.510 19:47:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:55.079 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:55.338 [2024-07-24 19:47:46.707699] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:55.338 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:55.338 "name": "raid_bdev1", 00:11:55.338 "aliases": [ 00:11:55.338 "35697ca3-a924-4a2c-9947-061f0b841a14" 00:11:55.338 ], 00:11:55.338 "product_name": "Raid Volume", 00:11:55.338 "block_size": 512, 00:11:55.338 "num_blocks": 126976, 00:11:55.338 "uuid": "35697ca3-a924-4a2c-9947-061f0b841a14", 00:11:55.338 "assigned_rate_limits": { 00:11:55.338 "rw_ios_per_sec": 0, 00:11:55.338 "rw_mbytes_per_sec": 0, 00:11:55.338 "r_mbytes_per_sec": 0, 00:11:55.338 "w_mbytes_per_sec": 0 00:11:55.338 }, 00:11:55.338 "claimed": false, 00:11:55.338 "zoned": false, 00:11:55.338 "supported_io_types": { 00:11:55.338 "read": true, 00:11:55.338 "write": true, 00:11:55.338 "unmap": true, 00:11:55.338 "flush": true, 00:11:55.338 "reset": true, 00:11:55.338 "nvme_admin": false, 00:11:55.338 "nvme_io": false, 00:11:55.338 "nvme_io_md": false, 00:11:55.338 "write_zeroes": true, 00:11:55.338 "zcopy": false, 00:11:55.338 "get_zone_info": false, 00:11:55.338 "zone_management": false, 00:11:55.338 "zone_append": false, 00:11:55.338 "compare": false, 00:11:55.338 "compare_and_write": false, 00:11:55.338 "abort": false, 00:11:55.338 "seek_hole": false, 00:11:55.338 "seek_data": false, 00:11:55.338 "copy": false, 00:11:55.338 "nvme_iov_md": false 00:11:55.338 }, 00:11:55.338 "memory_domains": [ 00:11:55.338 { 00:11:55.338 "dma_device_id": "system", 00:11:55.338 "dma_device_type": 1 00:11:55.338 }, 00:11:55.338 { 00:11:55.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.338 "dma_device_type": 2 00:11:55.338 }, 00:11:55.338 { 00:11:55.338 "dma_device_id": "system", 00:11:55.338 "dma_device_type": 1 00:11:55.338 }, 00:11:55.338 { 00:11:55.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.338 "dma_device_type": 2 00:11:55.338 } 00:11:55.338 ], 00:11:55.338 "driver_specific": { 00:11:55.338 "raid": { 
00:11:55.338 "uuid": "35697ca3-a924-4a2c-9947-061f0b841a14", 00:11:55.338 "strip_size_kb": 64, 00:11:55.338 "state": "online", 00:11:55.338 "raid_level": "raid0", 00:11:55.338 "superblock": true, 00:11:55.338 "num_base_bdevs": 2, 00:11:55.338 "num_base_bdevs_discovered": 2, 00:11:55.338 "num_base_bdevs_operational": 2, 00:11:55.338 "base_bdevs_list": [ 00:11:55.338 { 00:11:55.338 "name": "pt1", 00:11:55.338 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:55.338 "is_configured": true, 00:11:55.338 "data_offset": 2048, 00:11:55.338 "data_size": 63488 00:11:55.338 }, 00:11:55.338 { 00:11:55.338 "name": "pt2", 00:11:55.338 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:55.338 "is_configured": true, 00:11:55.338 "data_offset": 2048, 00:11:55.338 "data_size": 63488 00:11:55.338 } 00:11:55.338 ] 00:11:55.338 } 00:11:55.338 } 00:11:55.338 }' 00:11:55.338 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:55.338 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:55.338 pt2' 00:11:55.338 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:55.338 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:55.338 19:47:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:55.598 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:55.598 "name": "pt1", 00:11:55.598 "aliases": [ 00:11:55.598 "00000000-0000-0000-0000-000000000001" 00:11:55.598 ], 00:11:55.598 "product_name": "passthru", 00:11:55.598 "block_size": 512, 00:11:55.598 "num_blocks": 65536, 00:11:55.598 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:55.598 "assigned_rate_limits": { 00:11:55.598 
"rw_ios_per_sec": 0, 00:11:55.598 "rw_mbytes_per_sec": 0, 00:11:55.598 "r_mbytes_per_sec": 0, 00:11:55.598 "w_mbytes_per_sec": 0 00:11:55.598 }, 00:11:55.598 "claimed": true, 00:11:55.598 "claim_type": "exclusive_write", 00:11:55.598 "zoned": false, 00:11:55.598 "supported_io_types": { 00:11:55.598 "read": true, 00:11:55.598 "write": true, 00:11:55.598 "unmap": true, 00:11:55.598 "flush": true, 00:11:55.598 "reset": true, 00:11:55.598 "nvme_admin": false, 00:11:55.598 "nvme_io": false, 00:11:55.598 "nvme_io_md": false, 00:11:55.598 "write_zeroes": true, 00:11:55.598 "zcopy": true, 00:11:55.598 "get_zone_info": false, 00:11:55.598 "zone_management": false, 00:11:55.598 "zone_append": false, 00:11:55.598 "compare": false, 00:11:55.598 "compare_and_write": false, 00:11:55.598 "abort": true, 00:11:55.598 "seek_hole": false, 00:11:55.598 "seek_data": false, 00:11:55.598 "copy": true, 00:11:55.598 "nvme_iov_md": false 00:11:55.598 }, 00:11:55.598 "memory_domains": [ 00:11:55.598 { 00:11:55.598 "dma_device_id": "system", 00:11:55.598 "dma_device_type": 1 00:11:55.598 }, 00:11:55.598 { 00:11:55.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.598 "dma_device_type": 2 00:11:55.598 } 00:11:55.598 ], 00:11:55.598 "driver_specific": { 00:11:55.598 "passthru": { 00:11:55.598 "name": "pt1", 00:11:55.598 "base_bdev_name": "malloc1" 00:11:55.598 } 00:11:55.598 } 00:11:55.598 }' 00:11:55.598 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.598 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.598 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:55.598 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.598 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:55.857 
19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:55.857 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:56.117 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:56.117 "name": "pt2", 00:11:56.117 "aliases": [ 00:11:56.117 "00000000-0000-0000-0000-000000000002" 00:11:56.117 ], 00:11:56.117 "product_name": "passthru", 00:11:56.117 "block_size": 512, 00:11:56.117 "num_blocks": 65536, 00:11:56.117 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:56.117 "assigned_rate_limits": { 00:11:56.117 "rw_ios_per_sec": 0, 00:11:56.117 "rw_mbytes_per_sec": 0, 00:11:56.117 "r_mbytes_per_sec": 0, 00:11:56.117 "w_mbytes_per_sec": 0 00:11:56.117 }, 00:11:56.117 "claimed": true, 00:11:56.117 "claim_type": "exclusive_write", 00:11:56.117 "zoned": false, 00:11:56.117 "supported_io_types": { 00:11:56.117 "read": true, 00:11:56.117 "write": true, 00:11:56.117 "unmap": true, 00:11:56.117 "flush": true, 00:11:56.117 "reset": true, 00:11:56.117 "nvme_admin": false, 00:11:56.117 "nvme_io": false, 00:11:56.117 "nvme_io_md": false, 00:11:56.117 "write_zeroes": true, 00:11:56.117 
"zcopy": true, 00:11:56.117 "get_zone_info": false, 00:11:56.117 "zone_management": false, 00:11:56.117 "zone_append": false, 00:11:56.117 "compare": false, 00:11:56.117 "compare_and_write": false, 00:11:56.117 "abort": true, 00:11:56.117 "seek_hole": false, 00:11:56.117 "seek_data": false, 00:11:56.117 "copy": true, 00:11:56.117 "nvme_iov_md": false 00:11:56.117 }, 00:11:56.117 "memory_domains": [ 00:11:56.117 { 00:11:56.117 "dma_device_id": "system", 00:11:56.117 "dma_device_type": 1 00:11:56.117 }, 00:11:56.117 { 00:11:56.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.117 "dma_device_type": 2 00:11:56.117 } 00:11:56.117 ], 00:11:56.117 "driver_specific": { 00:11:56.117 "passthru": { 00:11:56.117 "name": "pt2", 00:11:56.117 "base_bdev_name": "malloc2" 00:11:56.117 } 00:11:56.117 } 00:11:56.117 }' 00:11:56.117 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.117 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.376 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.636 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == 
null ]] 00:11:56.636 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:56.636 19:47:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:11:56.636 [2024-07-24 19:47:48.211691] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 35697ca3-a924-4a2c-9947-061f0b841a14 '!=' 35697ca3-a924-4a2c-9947-061f0b841a14 ']' 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1376200 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1376200 ']' 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1376200 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1376200 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1376200' 00:11:56.895 killing process with pid 1376200 00:11:56.895 19:47:48 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1376200 00:11:56.895 [2024-07-24 19:47:48.283486] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:56.895 [2024-07-24 19:47:48.283540] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:56.895 [2024-07-24 19:47:48.283580] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:56.895 [2024-07-24 19:47:48.283591] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2215570 name raid_bdev1, state offline 00:11:56.895 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1376200 00:11:56.895 [2024-07-24 19:47:48.302800] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:57.155 19:47:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:11:57.155 00:11:57.155 real 0m10.444s 00:11:57.155 user 0m18.557s 00:11:57.155 sys 0m2.002s 00:11:57.155 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:57.155 19:47:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.155 ************************************ 00:11:57.155 END TEST raid_superblock_test 00:11:57.155 ************************************ 00:11:57.155 19:47:48 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:57.155 19:47:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:57.155 19:47:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:57.155 19:47:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:57.155 ************************************ 00:11:57.155 START TEST raid_read_error_test 00:11:57.155 ************************************ 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 read 00:11:57.155 
19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:11:57.155 19:47:48 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.FvKAjWViqa 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1377796 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1377796 /var/tmp/spdk-raid.sock 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1377796 ']' 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:57.155 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:57.155 19:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.155 [2024-07-24 19:47:48.687012] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:11:57.155 [2024-07-24 19:47:48.687065] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1377796 ] 00:11:57.414 [2024-07-24 19:47:48.798901] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:57.414 [2024-07-24 19:47:48.899094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:57.414 [2024-07-24 19:47:48.970701] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.414 [2024-07-24 19:47:48.970743] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.983 19:47:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:57.983 19:47:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:11:57.983 19:47:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:11:57.983 19:47:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:58.244 BaseBdev1_malloc 00:11:58.244 19:47:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:58.519 true 00:11:58.519 19:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:58.793 [2024-07-24 19:47:50.229445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:58.793 [2024-07-24 19:47:50.229495] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:11:58.793 [2024-07-24 19:47:50.229515] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11363a0 00:11:58.793 [2024-07-24 19:47:50.229528] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:58.793 [2024-07-24 19:47:50.231224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:58.793 [2024-07-24 19:47:50.231253] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:58.793 BaseBdev1 00:11:58.793 19:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:11:58.793 19:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:59.052 BaseBdev2_malloc 00:11:59.052 19:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:59.309 true 00:11:59.309 19:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:59.567 [2024-07-24 19:47:50.903792] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:59.567 [2024-07-24 19:47:50.903836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:59.567 [2024-07-24 19:47:50.903859] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11f5370 00:11:59.567 [2024-07-24 19:47:50.903871] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:59.567 [2024-07-24 19:47:50.905307] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:59.567 [2024-07-24 19:47:50.905334] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:59.567 BaseBdev2 00:11:59.567 19:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:59.567 [2024-07-24 19:47:51.140444] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:59.567 [2024-07-24 19:47:51.141604] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:59.567 [2024-07-24 19:47:51.141784] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x112c340 00:11:59.567 [2024-07-24 19:47:51.141797] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:59.567 [2024-07-24 19:47:51.141968] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x112d050 00:11:59.567 [2024-07-24 19:47:51.142107] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x112c340 00:11:59.567 [2024-07-24 19:47:51.142117] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x112c340 00:11:59.567 [2024-07-24 19:47:51.142215] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.824 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:00.081 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.081 "name": "raid_bdev1", 00:12:00.081 "uuid": "79e40e6b-b877-4717-be7a-da07bc3382fc", 00:12:00.081 "strip_size_kb": 64, 00:12:00.081 "state": "online", 00:12:00.081 "raid_level": "raid0", 00:12:00.081 "superblock": true, 00:12:00.081 "num_base_bdevs": 2, 00:12:00.081 "num_base_bdevs_discovered": 2, 00:12:00.081 "num_base_bdevs_operational": 2, 00:12:00.081 "base_bdevs_list": [ 00:12:00.081 { 00:12:00.081 "name": "BaseBdev1", 00:12:00.081 "uuid": "bc8f519e-71c9-5e7d-a6c9-1c347a45570d", 00:12:00.081 "is_configured": true, 00:12:00.081 "data_offset": 2048, 00:12:00.081 "data_size": 63488 00:12:00.081 }, 00:12:00.081 { 00:12:00.081 "name": "BaseBdev2", 00:12:00.081 "uuid": "2ec5e069-eb27-5991-8964-cd432417d331", 00:12:00.081 "is_configured": true, 00:12:00.081 "data_offset": 2048, 00:12:00.081 "data_size": 63488 00:12:00.081 } 00:12:00.081 ] 00:12:00.081 }' 00:12:00.081 19:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.081 19:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.646 19:47:52 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:00.646 19:47:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:12:00.646 [2024-07-24 19:47:52.107326] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f68f0 00:12:01.579 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:01.837 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:12:01.837 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.838 19:47:53 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:01.838 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.096 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.096 "name": "raid_bdev1", 00:12:02.096 "uuid": "79e40e6b-b877-4717-be7a-da07bc3382fc", 00:12:02.096 "strip_size_kb": 64, 00:12:02.096 "state": "online", 00:12:02.096 "raid_level": "raid0", 00:12:02.096 "superblock": true, 00:12:02.096 "num_base_bdevs": 2, 00:12:02.096 "num_base_bdevs_discovered": 2, 00:12:02.096 "num_base_bdevs_operational": 2, 00:12:02.096 "base_bdevs_list": [ 00:12:02.096 { 00:12:02.096 "name": "BaseBdev1", 00:12:02.096 "uuid": "bc8f519e-71c9-5e7d-a6c9-1c347a45570d", 00:12:02.096 "is_configured": true, 00:12:02.096 "data_offset": 2048, 00:12:02.096 "data_size": 63488 00:12:02.096 }, 00:12:02.096 { 00:12:02.096 "name": "BaseBdev2", 00:12:02.096 "uuid": "2ec5e069-eb27-5991-8964-cd432417d331", 00:12:02.096 "is_configured": true, 00:12:02.096 "data_offset": 2048, 00:12:02.096 "data_size": 63488 00:12:02.096 } 00:12:02.096 ] 00:12:02.096 }' 00:12:02.096 19:47:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.096 19:47:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.662 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:02.920 [2024-07-24 19:47:54.348507] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:02.920 [2024-07-24 19:47:54.348543] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:02.920 [2024-07-24 19:47:54.351715] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:02.920 [2024-07-24 19:47:54.351746] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:02.920 [2024-07-24 19:47:54.351772] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:02.920 [2024-07-24 19:47:54.351790] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x112c340 name raid_bdev1, state offline 00:12:02.920 0 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1377796 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1377796 ']' 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1377796 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1377796 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1377796' 00:12:02.920 killing process with pid 1377796 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1377796 00:12:02.920 [2024-07-24 19:47:54.433919] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:02.920 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1377796 00:12:02.920 [2024-07-24 19:47:54.444753] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.FvKAjWViqa 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.45 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.45 != \0\.\0\0 ]] 00:12:03.178 00:12:03.178 real 0m6.056s 00:12:03.178 user 0m9.412s 00:12:03.178 sys 0m1.060s 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:03.178 19:47:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.178 ************************************ 00:12:03.178 END TEST raid_read_error_test 00:12:03.178 ************************************ 00:12:03.178 19:47:54 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:12:03.178 19:47:54 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:03.178 19:47:54 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:03.178 19:47:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:03.178 ************************************ 00:12:03.178 START TEST raid_write_error_test 00:12:03.179 ************************************ 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 write 00:12:03.179 19:47:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:12:03.179 19:47:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.7KjT7PQcWN 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1378769 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1378769 /var/tmp/spdk-raid.sock 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1378769 ']' 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:03.179 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:03.179 19:47:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.437 [2024-07-24 19:47:54.823315] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:12:03.437 [2024-07-24 19:47:54.823382] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1378769 ] 00:12:03.437 [2024-07-24 19:47:54.949651] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:03.695 [2024-07-24 19:47:55.054641] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:03.695 [2024-07-24 19:47:55.122584] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:03.695 [2024-07-24 19:47:55.122620] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:04.626 19:47:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:04.626 19:47:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:04.626 19:47:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:04.626 19:47:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:04.883 BaseBdev1_malloc 00:12:04.883 19:47:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:05.448 true 00:12:05.448 19:47:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:05.706 [2024-07-24 19:47:57.271507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:05.706 [2024-07-24 19:47:57.271560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:05.706 [2024-07-24 19:47:57.271582] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21243a0 00:12:05.706 [2024-07-24 19:47:57.271595] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:05.706 [2024-07-24 19:47:57.273398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:05.706 [2024-07-24 19:47:57.273426] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:05.706 BaseBdev1 00:12:05.964 19:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:05.964 19:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:06.222 BaseBdev2_malloc 00:12:06.480 19:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:06.738 true 00:12:06.996 19:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:06.996 [2024-07-24 19:47:58.563682] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:06.996 [2024-07-24 19:47:58.563723] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:06.996 [2024-07-24 19:47:58.563748] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21e3370 00:12:06.996 [2024-07-24 19:47:58.563761] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:06.996 [2024-07-24 19:47:58.565218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:06.996 [2024-07-24 19:47:58.565244] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:06.996 BaseBdev2 00:12:06.996 19:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:07.562 [2024-07-24 19:47:59.073037] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:07.562 [2024-07-24 19:47:59.074384] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:07.562 [2024-07-24 19:47:59.074584] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x211a340 00:12:07.562 [2024-07-24 19:47:59.074598] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:07.562 [2024-07-24 19:47:59.074794] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x211b050 00:12:07.562 [2024-07-24 19:47:59.074942] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x211a340 00:12:07.562 [2024-07-24 19:47:59.074952] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x211a340 00:12:07.562 [2024-07-24 19:47:59.075059] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.562 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:07.821 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.821 "name": "raid_bdev1", 00:12:07.821 "uuid": "59539d1e-d84d-4d63-a28f-e2a360b59257", 00:12:07.821 "strip_size_kb": 64, 00:12:07.821 "state": "online", 00:12:07.821 "raid_level": "raid0", 00:12:07.821 "superblock": true, 00:12:07.821 "num_base_bdevs": 2, 00:12:07.821 "num_base_bdevs_discovered": 2, 00:12:07.821 "num_base_bdevs_operational": 2, 00:12:07.821 "base_bdevs_list": [ 00:12:07.821 { 00:12:07.821 "name": "BaseBdev1", 00:12:07.821 "uuid": "71090234-7494-5782-97d2-48bb61362649", 00:12:07.821 "is_configured": true, 00:12:07.821 "data_offset": 2048, 00:12:07.821 "data_size": 63488 00:12:07.821 }, 00:12:07.821 { 00:12:07.821 "name": "BaseBdev2", 00:12:07.821 "uuid": "16fd319c-ca79-5e37-b9ce-5e7fece4150c", 00:12:07.821 "is_configured": true, 00:12:07.821 "data_offset": 2048, 00:12:07.821 "data_size": 63488 00:12:07.821 } 00:12:07.821 ] 00:12:07.821 }' 00:12:07.821 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.821 19:47:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:08.388 
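The `verify_raid_bdev_state` step above amounts to selecting the target bdev from the `bdev_raid_get_bdevs all` output (the shell side uses `jq -r '.[] | select(.name == "raid_bdev1")'`) and comparing a few fields against expected values. A minimal Python sketch of that comparison, using the JSON payload captured in the log; the `verify_state` helper is hypothetical (illustration only, not part of the SPDK test suite), and the field names are taken verbatim from the log output:

```python
import json

# Raid bdev info as reported by `rpc.py ... bdev_raid_get_bdevs all`,
# already filtered down to raid_bdev1 (values copied from the log above;
# the base_bdevs_list entries are elided for brevity).
raid_bdev_info = json.loads("""
{
  "name": "raid_bdev1",
  "strip_size_kb": 64,
  "state": "online",
  "raid_level": "raid0",
  "superblock": true,
  "num_base_bdevs": 2,
  "num_base_bdevs_discovered": 2,
  "num_base_bdevs_operational": 2
}
""")

def verify_state(info, expected_state, raid_level, strip_size, num_operational):
    """Hypothetical mirror of the shell-side field checks in verify_raid_bdev_state."""
    assert info["state"] == expected_state, info["state"]
    assert info["raid_level"] == raid_level, info["raid_level"]
    assert info["strip_size_kb"] == strip_size, info["strip_size_kb"]
    assert info["num_base_bdevs_operational"] == num_operational

# Matches the call `verify_raid_bdev_state raid_bdev1 online raid0 64 2` in the log.
verify_state(raid_bdev_info, "online", "raid0", 64, 2)
```

For raid0 every base bdev must be present for the array to stay `online`, which is why the test expects `num_base_bdevs_operational` to equal the full base bdev count both before and after the write-error injection.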
19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:12:08.388 19:47:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:08.646 [2024-07-24 19:48:00.043936] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21e48f0 00:12:09.583 19:48:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:09.841 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.842 "name": "raid_bdev1", 00:12:09.842 "uuid": "59539d1e-d84d-4d63-a28f-e2a360b59257", 00:12:09.842 "strip_size_kb": 64, 00:12:09.842 "state": "online", 00:12:09.842 "raid_level": "raid0", 00:12:09.842 "superblock": true, 00:12:09.842 "num_base_bdevs": 2, 00:12:09.842 "num_base_bdevs_discovered": 2, 00:12:09.842 "num_base_bdevs_operational": 2, 00:12:09.842 "base_bdevs_list": [ 00:12:09.842 { 00:12:09.842 "name": "BaseBdev1", 00:12:09.842 "uuid": "71090234-7494-5782-97d2-48bb61362649", 00:12:09.842 "is_configured": true, 00:12:09.842 "data_offset": 2048, 00:12:09.842 "data_size": 63488 00:12:09.842 }, 00:12:09.842 { 00:12:09.842 "name": "BaseBdev2", 00:12:09.842 "uuid": "16fd319c-ca79-5e37-b9ce-5e7fece4150c", 00:12:09.842 "is_configured": true, 00:12:09.842 "data_offset": 2048, 00:12:09.842 "data_size": 63488 00:12:09.842 } 00:12:09.842 ] 00:12:09.842 }' 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.842 19:48:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.780 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:10.780 [2024-07-24 19:48:02.272672] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:10.780 [2024-07-24 19:48:02.272706] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:12:10.780 [2024-07-24 19:48:02.275866] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:10.780 [2024-07-24 19:48:02.275896] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:10.780 [2024-07-24 19:48:02.275921] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:10.780 [2024-07-24 19:48:02.275932] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x211a340 name raid_bdev1, state offline 00:12:10.780 0 00:12:10.780 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1378769 00:12:10.780 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1378769 ']' 00:12:10.780 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1378769 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1378769 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1378769' 00:12:10.781 killing process with pid 1378769 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1378769 00:12:10.781 [2024-07-24 19:48:02.361525] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:10.781 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1378769 
00:12:10.781 [2024-07-24 19:48:02.371852] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.7KjT7PQcWN 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.45 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.45 != \0\.\0\0 ]] 00:12:11.039 00:12:11.039 real 0m7.847s 00:12:11.039 user 0m12.765s 00:12:11.039 sys 0m1.286s 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:11.039 19:48:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.039 ************************************ 00:12:11.039 END TEST raid_write_error_test 00:12:11.039 ************************************ 00:12:11.299 19:48:02 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:12:11.299 19:48:02 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:12:11.299 19:48:02 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:11.299 19:48:02 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:11.299 19:48:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:11.299 ************************************ 00:12:11.299 START TEST raid_state_function_test 00:12:11.299 ************************************ 
00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 false 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:11.299 19:48:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1379993 00:12:11.299 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1379993' 00:12:11.299 Process raid pid: 1379993 00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1379993 /var/tmp/spdk-raid.sock 00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1379993 ']' 00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:11.300 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:11.300 19:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:11.300 [2024-07-24 19:48:02.748436] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:12:11.300 [2024-07-24 19:48:02.748504] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:11.300 [2024-07-24 19:48:02.871790] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:11.558 [2024-07-24 19:48:02.980528] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:11.558 [2024-07-24 19:48:03.045358] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.558 [2024-07-24 19:48:03.045388] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.125 19:48:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:12.125 19:48:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:12:12.125 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:12.384 [2024-07-24 19:48:03.907997] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:12.384 [2024-07-24 19:48:03.908035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:12.384 [2024-07-24 19:48:03.908046] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:12.384 [2024-07-24 19:48:03.908057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev2 doesn't exist now 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.384 19:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:12.643 19:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.643 "name": "Existed_Raid", 00:12:12.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.643 "strip_size_kb": 64, 00:12:12.644 "state": "configuring", 00:12:12.644 "raid_level": "concat", 00:12:12.644 "superblock": false, 00:12:12.644 "num_base_bdevs": 2, 00:12:12.644 "num_base_bdevs_discovered": 0, 00:12:12.644 "num_base_bdevs_operational": 
2, 00:12:12.644 "base_bdevs_list": [ 00:12:12.644 { 00:12:12.644 "name": "BaseBdev1", 00:12:12.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.644 "is_configured": false, 00:12:12.644 "data_offset": 0, 00:12:12.644 "data_size": 0 00:12:12.644 }, 00:12:12.644 { 00:12:12.644 "name": "BaseBdev2", 00:12:12.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.644 "is_configured": false, 00:12:12.644 "data_offset": 0, 00:12:12.644 "data_size": 0 00:12:12.644 } 00:12:12.644 ] 00:12:12.644 }' 00:12:12.644 19:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.644 19:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.579 19:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:13.579 [2024-07-24 19:48:05.066951] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:13.579 [2024-07-24 19:48:05.066981] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d09f0 name Existed_Raid, state configuring 00:12:13.579 19:48:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:13.837 [2024-07-24 19:48:05.311611] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:13.837 [2024-07-24 19:48:05.311640] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:13.837 [2024-07-24 19:48:05.311650] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:13.837 [2024-07-24 19:48:05.311661] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:13.838 19:48:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:14.096 [2024-07-24 19:48:05.562102] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:14.096 BaseBdev1 00:12:14.096 19:48:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:14.096 19:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:14.096 19:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:14.096 19:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:14.096 19:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:14.096 19:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:14.096 19:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:14.354 19:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:14.613 [ 00:12:14.613 { 00:12:14.613 "name": "BaseBdev1", 00:12:14.613 "aliases": [ 00:12:14.613 "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655" 00:12:14.613 ], 00:12:14.613 "product_name": "Malloc disk", 00:12:14.613 "block_size": 512, 00:12:14.613 "num_blocks": 65536, 00:12:14.613 "uuid": "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655", 00:12:14.613 "assigned_rate_limits": { 00:12:14.613 "rw_ios_per_sec": 0, 00:12:14.613 "rw_mbytes_per_sec": 0, 00:12:14.613 "r_mbytes_per_sec": 0, 00:12:14.613 "w_mbytes_per_sec": 0 00:12:14.613 }, 00:12:14.613 "claimed": true, 
00:12:14.613 "claim_type": "exclusive_write", 00:12:14.613 "zoned": false, 00:12:14.613 "supported_io_types": { 00:12:14.613 "read": true, 00:12:14.613 "write": true, 00:12:14.613 "unmap": true, 00:12:14.613 "flush": true, 00:12:14.613 "reset": true, 00:12:14.613 "nvme_admin": false, 00:12:14.614 "nvme_io": false, 00:12:14.614 "nvme_io_md": false, 00:12:14.614 "write_zeroes": true, 00:12:14.614 "zcopy": true, 00:12:14.614 "get_zone_info": false, 00:12:14.614 "zone_management": false, 00:12:14.614 "zone_append": false, 00:12:14.614 "compare": false, 00:12:14.614 "compare_and_write": false, 00:12:14.614 "abort": true, 00:12:14.614 "seek_hole": false, 00:12:14.614 "seek_data": false, 00:12:14.614 "copy": true, 00:12:14.614 "nvme_iov_md": false 00:12:14.614 }, 00:12:14.614 "memory_domains": [ 00:12:14.614 { 00:12:14.614 "dma_device_id": "system", 00:12:14.614 "dma_device_type": 1 00:12:14.614 }, 00:12:14.614 { 00:12:14.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.614 "dma_device_type": 2 00:12:14.614 } 00:12:14.614 ], 00:12:14.614 "driver_specific": {} 00:12:14.614 } 00:12:14.614 ] 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.614 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.873 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.873 "name": "Existed_Raid", 00:12:14.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.873 "strip_size_kb": 64, 00:12:14.873 "state": "configuring", 00:12:14.873 "raid_level": "concat", 00:12:14.873 "superblock": false, 00:12:14.873 "num_base_bdevs": 2, 00:12:14.873 "num_base_bdevs_discovered": 1, 00:12:14.873 "num_base_bdevs_operational": 2, 00:12:14.873 "base_bdevs_list": [ 00:12:14.873 { 00:12:14.873 "name": "BaseBdev1", 00:12:14.873 "uuid": "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655", 00:12:14.873 "is_configured": true, 00:12:14.873 "data_offset": 0, 00:12:14.873 "data_size": 65536 00:12:14.873 }, 00:12:14.873 { 00:12:14.873 "name": "BaseBdev2", 00:12:14.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.873 "is_configured": false, 00:12:14.873 "data_offset": 0, 00:12:14.873 "data_size": 0 00:12:14.873 } 00:12:14.873 ] 00:12:14.873 }' 00:12:14.873 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.873 19:48:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.439 19:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:15.742 [2024-07-24 19:48:07.110207] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:15.742 [2024-07-24 19:48:07.110244] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d02e0 name Existed_Raid, state configuring 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:15.742 [2024-07-24 19:48:07.290714] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:15.742 [2024-07-24 19:48:07.292185] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:15.742 [2024-07-24 19:48:07.292217] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:15.742 19:48:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:15.742 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.001 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.001 "name": "Existed_Raid", 00:12:16.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.001 "strip_size_kb": 64, 00:12:16.001 "state": "configuring", 00:12:16.001 "raid_level": "concat", 00:12:16.001 "superblock": false, 00:12:16.001 "num_base_bdevs": 2, 00:12:16.001 "num_base_bdevs_discovered": 1, 00:12:16.001 "num_base_bdevs_operational": 2, 00:12:16.001 "base_bdevs_list": [ 00:12:16.001 { 00:12:16.001 "name": "BaseBdev1", 00:12:16.001 "uuid": "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655", 00:12:16.001 "is_configured": true, 00:12:16.001 "data_offset": 0, 00:12:16.001 "data_size": 65536 00:12:16.001 }, 00:12:16.001 { 00:12:16.001 "name": "BaseBdev2", 00:12:16.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.001 "is_configured": false, 00:12:16.001 "data_offset": 0, 00:12:16.001 "data_size": 0 00:12:16.001 } 00:12:16.001 ] 00:12:16.001 }' 00:12:16.001 19:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.001 19:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.567 19:48:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:16.826 [2024-07-24 19:48:08.380999] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:16.826 [2024-07-24 19:48:08.381033] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22d10d0 00:12:16.826 [2024-07-24 19:48:08.381041] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:16.826 [2024-07-24 19:48:08.381229] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2474ab0 00:12:16.826 [2024-07-24 19:48:08.381350] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22d10d0 00:12:16.826 [2024-07-24 19:48:08.381360] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22d10d0 00:12:16.826 [2024-07-24 19:48:08.381529] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:16.826 BaseBdev2 00:12:16.826 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:16.826 19:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:16.826 19:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:16.826 19:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:16.826 19:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:16.826 19:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:16.826 19:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:17.084 19:48:08 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:17.342 [ 00:12:17.342 { 00:12:17.342 "name": "BaseBdev2", 00:12:17.342 "aliases": [ 00:12:17.342 "cd98c072-d961-4dbc-b184-76763012f929" 00:12:17.342 ], 00:12:17.342 "product_name": "Malloc disk", 00:12:17.342 "block_size": 512, 00:12:17.342 "num_blocks": 65536, 00:12:17.342 "uuid": "cd98c072-d961-4dbc-b184-76763012f929", 00:12:17.342 "assigned_rate_limits": { 00:12:17.342 "rw_ios_per_sec": 0, 00:12:17.342 "rw_mbytes_per_sec": 0, 00:12:17.342 "r_mbytes_per_sec": 0, 00:12:17.342 "w_mbytes_per_sec": 0 00:12:17.342 }, 00:12:17.342 "claimed": true, 00:12:17.342 "claim_type": "exclusive_write", 00:12:17.342 "zoned": false, 00:12:17.342 "supported_io_types": { 00:12:17.342 "read": true, 00:12:17.342 "write": true, 00:12:17.342 "unmap": true, 00:12:17.342 "flush": true, 00:12:17.342 "reset": true, 00:12:17.342 "nvme_admin": false, 00:12:17.342 "nvme_io": false, 00:12:17.342 "nvme_io_md": false, 00:12:17.342 "write_zeroes": true, 00:12:17.342 "zcopy": true, 00:12:17.342 "get_zone_info": false, 00:12:17.342 "zone_management": false, 00:12:17.342 "zone_append": false, 00:12:17.342 "compare": false, 00:12:17.342 "compare_and_write": false, 00:12:17.342 "abort": true, 00:12:17.342 "seek_hole": false, 00:12:17.342 "seek_data": false, 00:12:17.342 "copy": true, 00:12:17.342 "nvme_iov_md": false 00:12:17.342 }, 00:12:17.342 "memory_domains": [ 00:12:17.342 { 00:12:17.342 "dma_device_id": "system", 00:12:17.342 "dma_device_type": 1 00:12:17.342 }, 00:12:17.342 { 00:12:17.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.342 "dma_device_type": 2 00:12:17.342 } 00:12:17.342 ], 00:12:17.342 "driver_specific": {} 00:12:17.342 } 00:12:17.342 ] 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.342 19:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.601 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.601 "name": "Existed_Raid", 00:12:17.601 "uuid": "9ac9d24c-3716-49fe-9e33-54bca717947f", 00:12:17.601 "strip_size_kb": 64, 00:12:17.601 "state": "online", 00:12:17.601 "raid_level": "concat", 00:12:17.601 "superblock": false, 00:12:17.601 
"num_base_bdevs": 2, 00:12:17.601 "num_base_bdevs_discovered": 2, 00:12:17.601 "num_base_bdevs_operational": 2, 00:12:17.601 "base_bdevs_list": [ 00:12:17.601 { 00:12:17.601 "name": "BaseBdev1", 00:12:17.601 "uuid": "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655", 00:12:17.601 "is_configured": true, 00:12:17.601 "data_offset": 0, 00:12:17.601 "data_size": 65536 00:12:17.601 }, 00:12:17.601 { 00:12:17.601 "name": "BaseBdev2", 00:12:17.601 "uuid": "cd98c072-d961-4dbc-b184-76763012f929", 00:12:17.601 "is_configured": true, 00:12:17.601 "data_offset": 0, 00:12:17.601 "data_size": 65536 00:12:17.601 } 00:12:17.601 ] 00:12:17.601 }' 00:12:17.601 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.601 19:48:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:18.197 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:18.469 [2024-07-24 19:48:09.845168] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:18.469 19:48:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:18.469 "name": "Existed_Raid", 00:12:18.469 "aliases": [ 00:12:18.469 "9ac9d24c-3716-49fe-9e33-54bca717947f" 00:12:18.469 ], 00:12:18.469 "product_name": "Raid Volume", 00:12:18.469 "block_size": 512, 00:12:18.469 "num_blocks": 131072, 00:12:18.469 "uuid": "9ac9d24c-3716-49fe-9e33-54bca717947f", 00:12:18.469 "assigned_rate_limits": { 00:12:18.469 "rw_ios_per_sec": 0, 00:12:18.469 "rw_mbytes_per_sec": 0, 00:12:18.469 "r_mbytes_per_sec": 0, 00:12:18.469 "w_mbytes_per_sec": 0 00:12:18.469 }, 00:12:18.469 "claimed": false, 00:12:18.469 "zoned": false, 00:12:18.469 "supported_io_types": { 00:12:18.469 "read": true, 00:12:18.469 "write": true, 00:12:18.469 "unmap": true, 00:12:18.469 "flush": true, 00:12:18.469 "reset": true, 00:12:18.469 "nvme_admin": false, 00:12:18.469 "nvme_io": false, 00:12:18.469 "nvme_io_md": false, 00:12:18.469 "write_zeroes": true, 00:12:18.469 "zcopy": false, 00:12:18.469 "get_zone_info": false, 00:12:18.469 "zone_management": false, 00:12:18.469 "zone_append": false, 00:12:18.469 "compare": false, 00:12:18.469 "compare_and_write": false, 00:12:18.469 "abort": false, 00:12:18.469 "seek_hole": false, 00:12:18.469 "seek_data": false, 00:12:18.469 "copy": false, 00:12:18.469 "nvme_iov_md": false 00:12:18.469 }, 00:12:18.469 "memory_domains": [ 00:12:18.469 { 00:12:18.469 "dma_device_id": "system", 00:12:18.469 "dma_device_type": 1 00:12:18.469 }, 00:12:18.469 { 00:12:18.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.469 "dma_device_type": 2 00:12:18.469 }, 00:12:18.469 { 00:12:18.469 "dma_device_id": "system", 00:12:18.469 "dma_device_type": 1 00:12:18.469 }, 00:12:18.469 { 00:12:18.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.469 "dma_device_type": 2 00:12:18.469 } 00:12:18.469 ], 00:12:18.469 "driver_specific": { 00:12:18.469 "raid": { 00:12:18.469 "uuid": "9ac9d24c-3716-49fe-9e33-54bca717947f", 00:12:18.469 "strip_size_kb": 64, 00:12:18.469 "state": "online", 00:12:18.469 
"raid_level": "concat", 00:12:18.469 "superblock": false, 00:12:18.469 "num_base_bdevs": 2, 00:12:18.469 "num_base_bdevs_discovered": 2, 00:12:18.469 "num_base_bdevs_operational": 2, 00:12:18.469 "base_bdevs_list": [ 00:12:18.469 { 00:12:18.469 "name": "BaseBdev1", 00:12:18.469 "uuid": "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655", 00:12:18.469 "is_configured": true, 00:12:18.469 "data_offset": 0, 00:12:18.469 "data_size": 65536 00:12:18.469 }, 00:12:18.469 { 00:12:18.469 "name": "BaseBdev2", 00:12:18.469 "uuid": "cd98c072-d961-4dbc-b184-76763012f929", 00:12:18.469 "is_configured": true, 00:12:18.469 "data_offset": 0, 00:12:18.469 "data_size": 65536 00:12:18.469 } 00:12:18.469 ] 00:12:18.469 } 00:12:18.469 } 00:12:18.469 }' 00:12:18.469 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:18.469 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:18.469 BaseBdev2' 00:12:18.469 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:18.469 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:18.469 19:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:18.728 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:18.728 "name": "BaseBdev1", 00:12:18.728 "aliases": [ 00:12:18.728 "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655" 00:12:18.728 ], 00:12:18.728 "product_name": "Malloc disk", 00:12:18.728 "block_size": 512, 00:12:18.728 "num_blocks": 65536, 00:12:18.728 "uuid": "d6f3d196-38e0-4ec7-a0b9-ef3b1f2cf655", 00:12:18.728 "assigned_rate_limits": { 00:12:18.728 "rw_ios_per_sec": 0, 00:12:18.728 "rw_mbytes_per_sec": 0, 00:12:18.728 
"r_mbytes_per_sec": 0, 00:12:18.728 "w_mbytes_per_sec": 0 00:12:18.728 }, 00:12:18.728 "claimed": true, 00:12:18.728 "claim_type": "exclusive_write", 00:12:18.728 "zoned": false, 00:12:18.728 "supported_io_types": { 00:12:18.728 "read": true, 00:12:18.728 "write": true, 00:12:18.728 "unmap": true, 00:12:18.728 "flush": true, 00:12:18.728 "reset": true, 00:12:18.728 "nvme_admin": false, 00:12:18.728 "nvme_io": false, 00:12:18.728 "nvme_io_md": false, 00:12:18.728 "write_zeroes": true, 00:12:18.728 "zcopy": true, 00:12:18.728 "get_zone_info": false, 00:12:18.728 "zone_management": false, 00:12:18.728 "zone_append": false, 00:12:18.728 "compare": false, 00:12:18.728 "compare_and_write": false, 00:12:18.728 "abort": true, 00:12:18.728 "seek_hole": false, 00:12:18.728 "seek_data": false, 00:12:18.728 "copy": true, 00:12:18.728 "nvme_iov_md": false 00:12:18.728 }, 00:12:18.728 "memory_domains": [ 00:12:18.728 { 00:12:18.728 "dma_device_id": "system", 00:12:18.728 "dma_device_type": 1 00:12:18.728 }, 00:12:18.728 { 00:12:18.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.728 "dma_device_type": 2 00:12:18.728 } 00:12:18.728 ], 00:12:18.728 "driver_specific": {} 00:12:18.728 }' 00:12:18.728 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:18.728 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:18.728 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:18.728 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:18.728 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # 
jq .md_interleave 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:18.987 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:19.246 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:19.246 "name": "BaseBdev2", 00:12:19.246 "aliases": [ 00:12:19.246 "cd98c072-d961-4dbc-b184-76763012f929" 00:12:19.246 ], 00:12:19.246 "product_name": "Malloc disk", 00:12:19.246 "block_size": 512, 00:12:19.246 "num_blocks": 65536, 00:12:19.246 "uuid": "cd98c072-d961-4dbc-b184-76763012f929", 00:12:19.246 "assigned_rate_limits": { 00:12:19.246 "rw_ios_per_sec": 0, 00:12:19.246 "rw_mbytes_per_sec": 0, 00:12:19.246 "r_mbytes_per_sec": 0, 00:12:19.246 "w_mbytes_per_sec": 0 00:12:19.246 }, 00:12:19.246 "claimed": true, 00:12:19.246 "claim_type": "exclusive_write", 00:12:19.246 "zoned": false, 00:12:19.246 "supported_io_types": { 00:12:19.246 "read": true, 00:12:19.246 "write": true, 00:12:19.246 "unmap": true, 00:12:19.246 "flush": true, 00:12:19.246 "reset": true, 00:12:19.246 "nvme_admin": false, 00:12:19.246 "nvme_io": false, 00:12:19.246 "nvme_io_md": false, 00:12:19.246 "write_zeroes": true, 00:12:19.246 "zcopy": true, 00:12:19.246 "get_zone_info": false, 00:12:19.246 "zone_management": false, 00:12:19.246 "zone_append": 
false, 00:12:19.246 "compare": false, 00:12:19.246 "compare_and_write": false, 00:12:19.246 "abort": true, 00:12:19.246 "seek_hole": false, 00:12:19.246 "seek_data": false, 00:12:19.246 "copy": true, 00:12:19.246 "nvme_iov_md": false 00:12:19.246 }, 00:12:19.246 "memory_domains": [ 00:12:19.246 { 00:12:19.246 "dma_device_id": "system", 00:12:19.246 "dma_device_type": 1 00:12:19.246 }, 00:12:19.246 { 00:12:19.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.246 "dma_device_type": 2 00:12:19.246 } 00:12:19.246 ], 00:12:19.246 "driver_specific": {} 00:12:19.246 }' 00:12:19.246 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.506 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.506 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:19.506 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.506 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.506 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:19.506 19:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.506 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.506 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:19.506 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:12:19.764 [2024-07-24 19:48:11.296790] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:19.764 [2024-07-24 19:48:11.296814] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:19.764 [2024-07-24 19:48:11.296851] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:19.764 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.765 19:48:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.765 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.023 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.024 "name": "Existed_Raid", 00:12:20.024 "uuid": "9ac9d24c-3716-49fe-9e33-54bca717947f", 00:12:20.024 "strip_size_kb": 64, 00:12:20.024 "state": "offline", 00:12:20.024 "raid_level": "concat", 00:12:20.024 "superblock": false, 00:12:20.024 "num_base_bdevs": 2, 00:12:20.024 "num_base_bdevs_discovered": 1, 00:12:20.024 "num_base_bdevs_operational": 1, 00:12:20.024 "base_bdevs_list": [ 00:12:20.024 { 00:12:20.024 "name": null, 00:12:20.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.024 "is_configured": false, 00:12:20.024 "data_offset": 0, 00:12:20.024 "data_size": 65536 00:12:20.024 }, 00:12:20.024 { 00:12:20.024 "name": "BaseBdev2", 00:12:20.024 "uuid": "cd98c072-d961-4dbc-b184-76763012f929", 00:12:20.024 "is_configured": true, 00:12:20.024 "data_offset": 0, 00:12:20.024 "data_size": 65536 00:12:20.024 } 00:12:20.024 ] 00:12:20.024 }' 00:12:20.024 19:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.024 19:48:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.591 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:20.591 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:20.591 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:20.591 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:20.850 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:20.850 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:20.850 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:21.108 [2024-07-24 19:48:12.573214] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:21.108 [2024-07-24 19:48:12.573265] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d10d0 name Existed_Raid, state offline 00:12:21.108 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:21.108 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:21.108 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.108 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1379993 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1379993 ']' 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1379993 00:12:21.367 19:48:12 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1379993 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1379993' 00:12:21.367 killing process with pid 1379993 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1379993 00:12:21.367 [2024-07-24 19:48:12.903471] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:21.367 19:48:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1379993 00:12:21.367 [2024-07-24 19:48:12.904468] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:21.626 00:12:21.626 real 0m10.447s 00:12:21.626 user 0m18.615s 00:12:21.626 sys 0m1.930s 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.626 ************************************ 00:12:21.626 END TEST raid_state_function_test 00:12:21.626 ************************************ 00:12:21.626 19:48:13 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:21.626 19:48:13 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:21.626 19:48:13 bdev_raid -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:12:21.626 19:48:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:21.626 ************************************ 00:12:21.626 START TEST raid_state_function_test_sb 00:12:21.626 ************************************ 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 true 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:21.626 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:21.890 19:48:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1381893 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1381893' 00:12:21.890 Process raid pid: 1381893 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1381893 /var/tmp/spdk-raid.sock 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1381893 ']' 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 
00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:21.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:21.890 19:48:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:21.890 [2024-07-24 19:48:13.281848] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:12:21.890 [2024-07-24 19:48:13.281915] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:21.890 [2024-07-24 19:48:13.412282] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:22.151 [2024-07-24 19:48:13.516676] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:22.151 [2024-07-24 19:48:13.580769] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:22.151 [2024-07-24 19:48:13.580805] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:22.718 19:48:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:22.718 19:48:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:12:22.718 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:22.976 [2024-07-24 19:48:14.437406] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:22.976 [2024-07-24 
19:48:14.437444] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:22.976 [2024-07-24 19:48:14.437455] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:22.976 [2024-07-24 19:48:14.437467] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.976 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.235 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:12:23.235 "name": "Existed_Raid", 00:12:23.235 "uuid": "aae0a871-1b47-46d5-bf20-99c4e02547e3", 00:12:23.235 "strip_size_kb": 64, 00:12:23.235 "state": "configuring", 00:12:23.235 "raid_level": "concat", 00:12:23.235 "superblock": true, 00:12:23.235 "num_base_bdevs": 2, 00:12:23.235 "num_base_bdevs_discovered": 0, 00:12:23.235 "num_base_bdevs_operational": 2, 00:12:23.235 "base_bdevs_list": [ 00:12:23.235 { 00:12:23.235 "name": "BaseBdev1", 00:12:23.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.235 "is_configured": false, 00:12:23.235 "data_offset": 0, 00:12:23.235 "data_size": 0 00:12:23.235 }, 00:12:23.235 { 00:12:23.235 "name": "BaseBdev2", 00:12:23.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.235 "is_configured": false, 00:12:23.235 "data_offset": 0, 00:12:23.235 "data_size": 0 00:12:23.235 } 00:12:23.235 ] 00:12:23.235 }' 00:12:23.235 19:48:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.235 19:48:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:23.802 19:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:24.060 [2024-07-24 19:48:15.548185] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:24.060 [2024-07-24 19:48:15.548213] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc2f9f0 name Existed_Raid, state configuring 00:12:24.060 19:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:24.318 [2024-07-24 19:48:15.796868] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:24.318 [2024-07-24 
19:48:15.796894] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:24.318 [2024-07-24 19:48:15.796903] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:24.318 [2024-07-24 19:48:15.796915] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:24.318 19:48:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:24.577 [2024-07-24 19:48:16.064413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:24.577 BaseBdev1 00:12:24.577 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:24.577 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:24.577 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:24.577 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:24.577 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:24.577 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:24.577 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:24.836 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:25.095 [ 00:12:25.095 { 00:12:25.095 "name": "BaseBdev1", 00:12:25.095 "aliases": [ 00:12:25.095 "176b3258-bc2e-4a61-8022-aaaa4fcf7c11" 
00:12:25.095 ], 00:12:25.095 "product_name": "Malloc disk", 00:12:25.095 "block_size": 512, 00:12:25.095 "num_blocks": 65536, 00:12:25.095 "uuid": "176b3258-bc2e-4a61-8022-aaaa4fcf7c11", 00:12:25.095 "assigned_rate_limits": { 00:12:25.095 "rw_ios_per_sec": 0, 00:12:25.095 "rw_mbytes_per_sec": 0, 00:12:25.095 "r_mbytes_per_sec": 0, 00:12:25.095 "w_mbytes_per_sec": 0 00:12:25.095 }, 00:12:25.095 "claimed": true, 00:12:25.095 "claim_type": "exclusive_write", 00:12:25.095 "zoned": false, 00:12:25.095 "supported_io_types": { 00:12:25.095 "read": true, 00:12:25.095 "write": true, 00:12:25.095 "unmap": true, 00:12:25.095 "flush": true, 00:12:25.095 "reset": true, 00:12:25.095 "nvme_admin": false, 00:12:25.095 "nvme_io": false, 00:12:25.095 "nvme_io_md": false, 00:12:25.095 "write_zeroes": true, 00:12:25.095 "zcopy": true, 00:12:25.095 "get_zone_info": false, 00:12:25.095 "zone_management": false, 00:12:25.095 "zone_append": false, 00:12:25.095 "compare": false, 00:12:25.095 "compare_and_write": false, 00:12:25.095 "abort": true, 00:12:25.095 "seek_hole": false, 00:12:25.095 "seek_data": false, 00:12:25.095 "copy": true, 00:12:25.095 "nvme_iov_md": false 00:12:25.095 }, 00:12:25.095 "memory_domains": [ 00:12:25.095 { 00:12:25.095 "dma_device_id": "system", 00:12:25.095 "dma_device_type": 1 00:12:25.095 }, 00:12:25.095 { 00:12:25.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.095 "dma_device_type": 2 00:12:25.095 } 00:12:25.095 ], 00:12:25.095 "driver_specific": {} 00:12:25.095 } 00:12:25.095 ] 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.095 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:25.354 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.354 "name": "Existed_Raid", 00:12:25.354 "uuid": "701fa2e1-c045-4981-b43a-c5d405a5e548", 00:12:25.354 "strip_size_kb": 64, 00:12:25.354 "state": "configuring", 00:12:25.354 "raid_level": "concat", 00:12:25.354 "superblock": true, 00:12:25.354 "num_base_bdevs": 2, 00:12:25.354 "num_base_bdevs_discovered": 1, 00:12:25.354 "num_base_bdevs_operational": 2, 00:12:25.354 "base_bdevs_list": [ 00:12:25.354 { 00:12:25.354 "name": "BaseBdev1", 00:12:25.354 "uuid": "176b3258-bc2e-4a61-8022-aaaa4fcf7c11", 00:12:25.354 "is_configured": true, 00:12:25.354 "data_offset": 2048, 00:12:25.354 "data_size": 63488 00:12:25.354 }, 00:12:25.354 { 00:12:25.354 "name": "BaseBdev2", 00:12:25.354 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:25.354 "is_configured": false, 00:12:25.354 "data_offset": 0, 00:12:25.354 "data_size": 0 00:12:25.354 } 00:12:25.354 ] 00:12:25.354 }' 00:12:25.354 19:48:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.354 19:48:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:25.920 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:26.179 [2024-07-24 19:48:17.628598] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:26.179 [2024-07-24 19:48:17.628633] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc2f2e0 name Existed_Raid, state configuring 00:12:26.179 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:26.437 [2024-07-24 19:48:17.813129] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:26.437 [2024-07-24 19:48:17.814606] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:26.437 [2024-07-24 19:48:17.814637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.437 19:48:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.695 19:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.695 "name": "Existed_Raid", 00:12:26.695 "uuid": "ec1a4720-c2b1-4d44-93bd-1bb56b6e0f1f", 00:12:26.695 "strip_size_kb": 64, 00:12:26.695 "state": "configuring", 00:12:26.695 "raid_level": "concat", 00:12:26.695 "superblock": true, 00:12:26.695 "num_base_bdevs": 2, 00:12:26.695 "num_base_bdevs_discovered": 1, 00:12:26.695 "num_base_bdevs_operational": 2, 00:12:26.695 "base_bdevs_list": [ 00:12:26.695 { 00:12:26.696 "name": "BaseBdev1", 00:12:26.696 "uuid": "176b3258-bc2e-4a61-8022-aaaa4fcf7c11", 00:12:26.696 "is_configured": true, 00:12:26.696 "data_offset": 2048, 00:12:26.696 "data_size": 
63488 00:12:26.696 }, 00:12:26.696 { 00:12:26.696 "name": "BaseBdev2", 00:12:26.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.696 "is_configured": false, 00:12:26.696 "data_offset": 0, 00:12:26.696 "data_size": 0 00:12:26.696 } 00:12:26.696 ] 00:12:26.696 }' 00:12:26.696 19:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.696 19:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:27.262 19:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:27.521 [2024-07-24 19:48:18.875211] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:27.521 [2024-07-24 19:48:18.875354] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc300d0 00:12:27.521 [2024-07-24 19:48:18.875367] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:27.521 [2024-07-24 19:48:18.875556] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xde3a80 00:12:27.521 [2024-07-24 19:48:18.875677] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc300d0 00:12:27.521 [2024-07-24 19:48:18.875688] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc300d0 00:12:27.521 [2024-07-24 19:48:18.875777] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:27.521 BaseBdev2 00:12:27.521 19:48:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:27.521 19:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:27.521 19:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:27.521 19:48:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:27.521 19:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:27.521 19:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:27.521 19:48:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:27.779 19:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:28.038 [ 00:12:28.038 { 00:12:28.038 "name": "BaseBdev2", 00:12:28.038 "aliases": [ 00:12:28.038 "f18a16e4-30a7-4863-8446-4d6b9ae390a0" 00:12:28.038 ], 00:12:28.038 "product_name": "Malloc disk", 00:12:28.038 "block_size": 512, 00:12:28.038 "num_blocks": 65536, 00:12:28.038 "uuid": "f18a16e4-30a7-4863-8446-4d6b9ae390a0", 00:12:28.038 "assigned_rate_limits": { 00:12:28.038 "rw_ios_per_sec": 0, 00:12:28.038 "rw_mbytes_per_sec": 0, 00:12:28.038 "r_mbytes_per_sec": 0, 00:12:28.038 "w_mbytes_per_sec": 0 00:12:28.038 }, 00:12:28.038 "claimed": true, 00:12:28.038 "claim_type": "exclusive_write", 00:12:28.038 "zoned": false, 00:12:28.038 "supported_io_types": { 00:12:28.038 "read": true, 00:12:28.038 "write": true, 00:12:28.038 "unmap": true, 00:12:28.038 "flush": true, 00:12:28.038 "reset": true, 00:12:28.038 "nvme_admin": false, 00:12:28.038 "nvme_io": false, 00:12:28.038 "nvme_io_md": false, 00:12:28.038 "write_zeroes": true, 00:12:28.038 "zcopy": true, 00:12:28.038 "get_zone_info": false, 00:12:28.038 "zone_management": false, 00:12:28.038 "zone_append": false, 00:12:28.038 "compare": false, 00:12:28.038 "compare_and_write": false, 00:12:28.038 "abort": true, 00:12:28.038 "seek_hole": false, 00:12:28.038 "seek_data": false, 
00:12:28.038 "copy": true, 00:12:28.038 "nvme_iov_md": false 00:12:28.038 }, 00:12:28.038 "memory_domains": [ 00:12:28.038 { 00:12:28.038 "dma_device_id": "system", 00:12:28.038 "dma_device_type": 1 00:12:28.038 }, 00:12:28.038 { 00:12:28.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.038 "dma_device_type": 2 00:12:28.038 } 00:12:28.038 ], 00:12:28.038 "driver_specific": {} 00:12:28.038 } 00:12:28.038 ] 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.038 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.297 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.297 "name": "Existed_Raid", 00:12:28.297 "uuid": "ec1a4720-c2b1-4d44-93bd-1bb56b6e0f1f", 00:12:28.297 "strip_size_kb": 64, 00:12:28.297 "state": "online", 00:12:28.297 "raid_level": "concat", 00:12:28.297 "superblock": true, 00:12:28.297 "num_base_bdevs": 2, 00:12:28.297 "num_base_bdevs_discovered": 2, 00:12:28.297 "num_base_bdevs_operational": 2, 00:12:28.297 "base_bdevs_list": [ 00:12:28.297 { 00:12:28.297 "name": "BaseBdev1", 00:12:28.297 "uuid": "176b3258-bc2e-4a61-8022-aaaa4fcf7c11", 00:12:28.297 "is_configured": true, 00:12:28.297 "data_offset": 2048, 00:12:28.297 "data_size": 63488 00:12:28.297 }, 00:12:28.297 { 00:12:28.297 "name": "BaseBdev2", 00:12:28.297 "uuid": "f18a16e4-30a7-4863-8446-4d6b9ae390a0", 00:12:28.297 "is_configured": true, 00:12:28.297 "data_offset": 2048, 00:12:28.297 "data_size": 63488 00:12:28.297 } 00:12:28.297 ] 00:12:28.297 }' 00:12:28.297 19:48:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.297 19:48:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.863 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:28.863 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:28.863 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:28.863 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:28.863 19:48:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:28.863 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:28.863 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:28.863 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:29.121 [2024-07-24 19:48:20.487769] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:29.121 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:29.121 "name": "Existed_Raid", 00:12:29.121 "aliases": [ 00:12:29.121 "ec1a4720-c2b1-4d44-93bd-1bb56b6e0f1f" 00:12:29.121 ], 00:12:29.121 "product_name": "Raid Volume", 00:12:29.121 "block_size": 512, 00:12:29.121 "num_blocks": 126976, 00:12:29.121 "uuid": "ec1a4720-c2b1-4d44-93bd-1bb56b6e0f1f", 00:12:29.122 "assigned_rate_limits": { 00:12:29.122 "rw_ios_per_sec": 0, 00:12:29.122 "rw_mbytes_per_sec": 0, 00:12:29.122 "r_mbytes_per_sec": 0, 00:12:29.122 "w_mbytes_per_sec": 0 00:12:29.122 }, 00:12:29.122 "claimed": false, 00:12:29.122 "zoned": false, 00:12:29.122 "supported_io_types": { 00:12:29.122 "read": true, 00:12:29.122 "write": true, 00:12:29.122 "unmap": true, 00:12:29.122 "flush": true, 00:12:29.122 "reset": true, 00:12:29.122 "nvme_admin": false, 00:12:29.122 "nvme_io": false, 00:12:29.122 "nvme_io_md": false, 00:12:29.122 "write_zeroes": true, 00:12:29.122 "zcopy": false, 00:12:29.122 "get_zone_info": false, 00:12:29.122 "zone_management": false, 00:12:29.122 "zone_append": false, 00:12:29.122 "compare": false, 00:12:29.122 "compare_and_write": false, 00:12:29.122 "abort": false, 00:12:29.122 "seek_hole": false, 00:12:29.122 "seek_data": false, 00:12:29.122 "copy": false, 00:12:29.122 "nvme_iov_md": false 00:12:29.122 }, 00:12:29.122 
"memory_domains": [ 00:12:29.122 { 00:12:29.122 "dma_device_id": "system", 00:12:29.122 "dma_device_type": 1 00:12:29.122 }, 00:12:29.122 { 00:12:29.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.122 "dma_device_type": 2 00:12:29.122 }, 00:12:29.122 { 00:12:29.122 "dma_device_id": "system", 00:12:29.122 "dma_device_type": 1 00:12:29.122 }, 00:12:29.122 { 00:12:29.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.122 "dma_device_type": 2 00:12:29.122 } 00:12:29.122 ], 00:12:29.122 "driver_specific": { 00:12:29.122 "raid": { 00:12:29.122 "uuid": "ec1a4720-c2b1-4d44-93bd-1bb56b6e0f1f", 00:12:29.122 "strip_size_kb": 64, 00:12:29.122 "state": "online", 00:12:29.122 "raid_level": "concat", 00:12:29.122 "superblock": true, 00:12:29.122 "num_base_bdevs": 2, 00:12:29.122 "num_base_bdevs_discovered": 2, 00:12:29.122 "num_base_bdevs_operational": 2, 00:12:29.122 "base_bdevs_list": [ 00:12:29.122 { 00:12:29.122 "name": "BaseBdev1", 00:12:29.122 "uuid": "176b3258-bc2e-4a61-8022-aaaa4fcf7c11", 00:12:29.122 "is_configured": true, 00:12:29.122 "data_offset": 2048, 00:12:29.122 "data_size": 63488 00:12:29.122 }, 00:12:29.122 { 00:12:29.122 "name": "BaseBdev2", 00:12:29.122 "uuid": "f18a16e4-30a7-4863-8446-4d6b9ae390a0", 00:12:29.122 "is_configured": true, 00:12:29.122 "data_offset": 2048, 00:12:29.122 "data_size": 63488 00:12:29.122 } 00:12:29.122 ] 00:12:29.122 } 00:12:29.122 } 00:12:29.122 }' 00:12:29.122 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:29.122 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:29.122 BaseBdev2' 00:12:29.122 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:29.122 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:29.122 19:48:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:29.380 "name": "BaseBdev1", 00:12:29.380 "aliases": [ 00:12:29.380 "176b3258-bc2e-4a61-8022-aaaa4fcf7c11" 00:12:29.380 ], 00:12:29.380 "product_name": "Malloc disk", 00:12:29.380 "block_size": 512, 00:12:29.380 "num_blocks": 65536, 00:12:29.380 "uuid": "176b3258-bc2e-4a61-8022-aaaa4fcf7c11", 00:12:29.380 "assigned_rate_limits": { 00:12:29.380 "rw_ios_per_sec": 0, 00:12:29.380 "rw_mbytes_per_sec": 0, 00:12:29.380 "r_mbytes_per_sec": 0, 00:12:29.380 "w_mbytes_per_sec": 0 00:12:29.380 }, 00:12:29.380 "claimed": true, 00:12:29.380 "claim_type": "exclusive_write", 00:12:29.380 "zoned": false, 00:12:29.380 "supported_io_types": { 00:12:29.380 "read": true, 00:12:29.380 "write": true, 00:12:29.380 "unmap": true, 00:12:29.380 "flush": true, 00:12:29.380 "reset": true, 00:12:29.380 "nvme_admin": false, 00:12:29.380 "nvme_io": false, 00:12:29.380 "nvme_io_md": false, 00:12:29.380 "write_zeroes": true, 00:12:29.380 "zcopy": true, 00:12:29.380 "get_zone_info": false, 00:12:29.380 "zone_management": false, 00:12:29.380 "zone_append": false, 00:12:29.380 "compare": false, 00:12:29.380 "compare_and_write": false, 00:12:29.380 "abort": true, 00:12:29.380 "seek_hole": false, 00:12:29.380 "seek_data": false, 00:12:29.380 "copy": true, 00:12:29.380 "nvme_iov_md": false 00:12:29.380 }, 00:12:29.380 "memory_domains": [ 00:12:29.380 { 00:12:29.380 "dma_device_id": "system", 00:12:29.380 "dma_device_type": 1 00:12:29.380 }, 00:12:29.380 { 00:12:29.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.380 "dma_device_type": 2 00:12:29.380 } 00:12:29.380 ], 00:12:29.380 "driver_specific": {} 00:12:29.380 }' 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:29.380 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:29.639 19:48:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:29.639 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:29.639 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:29.639 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:29.639 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:29.639 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:29.639 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:29.896 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:29.896 "name": "BaseBdev2", 00:12:29.896 "aliases": [ 00:12:29.896 "f18a16e4-30a7-4863-8446-4d6b9ae390a0" 00:12:29.896 ], 00:12:29.896 "product_name": "Malloc disk", 00:12:29.896 "block_size": 512, 00:12:29.896 "num_blocks": 65536, 00:12:29.896 "uuid": 
"f18a16e4-30a7-4863-8446-4d6b9ae390a0", 00:12:29.896 "assigned_rate_limits": { 00:12:29.896 "rw_ios_per_sec": 0, 00:12:29.896 "rw_mbytes_per_sec": 0, 00:12:29.896 "r_mbytes_per_sec": 0, 00:12:29.896 "w_mbytes_per_sec": 0 00:12:29.896 }, 00:12:29.896 "claimed": true, 00:12:29.896 "claim_type": "exclusive_write", 00:12:29.896 "zoned": false, 00:12:29.896 "supported_io_types": { 00:12:29.896 "read": true, 00:12:29.896 "write": true, 00:12:29.896 "unmap": true, 00:12:29.896 "flush": true, 00:12:29.896 "reset": true, 00:12:29.896 "nvme_admin": false, 00:12:29.896 "nvme_io": false, 00:12:29.896 "nvme_io_md": false, 00:12:29.896 "write_zeroes": true, 00:12:29.896 "zcopy": true, 00:12:29.896 "get_zone_info": false, 00:12:29.896 "zone_management": false, 00:12:29.896 "zone_append": false, 00:12:29.896 "compare": false, 00:12:29.896 "compare_and_write": false, 00:12:29.896 "abort": true, 00:12:29.896 "seek_hole": false, 00:12:29.896 "seek_data": false, 00:12:29.896 "copy": true, 00:12:29.896 "nvme_iov_md": false 00:12:29.896 }, 00:12:29.896 "memory_domains": [ 00:12:29.896 { 00:12:29.896 "dma_device_id": "system", 00:12:29.896 "dma_device_type": 1 00:12:29.896 }, 00:12:29.896 { 00:12:29.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.896 "dma_device_type": 2 00:12:29.896 } 00:12:29.896 ], 00:12:29.896 "driver_specific": {} 00:12:29.896 }' 00:12:29.896 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:29.896 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:29.896 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:29.896 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:29.896 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:30.154 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:30.412 [2024-07-24 19:48:21.911329] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:30.412 [2024-07-24 19:48:21.911356] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:30.412 [2024-07-24 19:48:21.911401] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.412 19:48:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.669 19:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.669 "name": "Existed_Raid", 00:12:30.669 "uuid": "ec1a4720-c2b1-4d44-93bd-1bb56b6e0f1f", 00:12:30.669 "strip_size_kb": 64, 00:12:30.669 "state": "offline", 00:12:30.669 "raid_level": "concat", 00:12:30.669 "superblock": true, 00:12:30.669 "num_base_bdevs": 2, 00:12:30.669 "num_base_bdevs_discovered": 1, 00:12:30.669 "num_base_bdevs_operational": 1, 00:12:30.669 "base_bdevs_list": [ 00:12:30.669 { 00:12:30.669 "name": null, 00:12:30.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.669 "is_configured": false, 00:12:30.669 "data_offset": 2048, 00:12:30.669 "data_size": 63488 
00:12:30.669 }, 00:12:30.669 { 00:12:30.669 "name": "BaseBdev2", 00:12:30.669 "uuid": "f18a16e4-30a7-4863-8446-4d6b9ae390a0", 00:12:30.669 "is_configured": true, 00:12:30.669 "data_offset": 2048, 00:12:30.669 "data_size": 63488 00:12:30.669 } 00:12:30.669 ] 00:12:30.669 }' 00:12:30.669 19:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.669 19:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:31.233 19:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:31.233 19:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:31.233 19:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.233 19:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:31.491 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:31.491 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:31.491 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:31.749 [2024-07-24 19:48:23.260816] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:31.749 [2024-07-24 19:48:23.260865] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc300d0 name Existed_Raid, state offline 00:12:31.749 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:31.749 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:31.749 19:48:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.749 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1381893 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1381893 ']' 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1381893 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1381893 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1381893' 00:12:32.007 killing process with pid 1381893 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1381893 00:12:32.007 [2024-07-24 19:48:23.590401] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:32.007 19:48:23 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@974 -- # wait 1381893 00:12:32.007 [2024-07-24 19:48:23.591386] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:32.265 19:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:32.265 00:12:32.265 real 0m10.599s 00:12:32.265 user 0m18.765s 00:12:32.265 sys 0m2.032s 00:12:32.265 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:32.265 19:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:32.265 ************************************ 00:12:32.265 END TEST raid_state_function_test_sb 00:12:32.265 ************************************ 00:12:32.265 19:48:23 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:32.265 19:48:23 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:12:32.265 19:48:23 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:32.265 19:48:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:32.523 ************************************ 00:12:32.523 START TEST raid_superblock_test 00:12:32.523 ************************************ 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 2 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:12:32.523 19:48:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1383521 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1383521 /var/tmp/spdk-raid.sock 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1383521 ']' 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:12:32.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:32.523 19:48:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.523 [2024-07-24 19:48:23.955990] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:12:32.523 [2024-07-24 19:48:23.956047] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1383521 ] 00:12:32.523 [2024-07-24 19:48:24.068947] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:32.781 [2024-07-24 19:48:24.176163] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:32.781 [2024-07-24 19:48:24.240859] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:32.781 [2024-07-24 19:48:24.240907] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:33.346 19:48:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:33.346 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:33.346 malloc1 00:12:33.603 19:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:33.603 [2024-07-24 19:48:25.102652] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:33.603 [2024-07-24 19:48:25.102702] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:33.603 [2024-07-24 19:48:25.102722] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f93590 00:12:33.603 [2024-07-24 19:48:25.102735] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:33.603 [2024-07-24 19:48:25.104290] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:33.603 [2024-07-24 19:48:25.104319] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:33.603 pt1 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:12:33.603 19:48:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:33.603 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:33.861 malloc2 00:12:33.861 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:33.861 [2024-07-24 19:48:25.448311] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:33.861 [2024-07-24 19:48:25.448356] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:33.861 [2024-07-24 19:48:25.448372] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2139690 00:12:33.861 [2024-07-24 19:48:25.448384] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:33.861 [2024-07-24 19:48:25.449798] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:33.861 [2024-07-24 19:48:25.449827] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:33.861 pt2 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:34.119 [2024-07-24 19:48:25.636834] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:34.119 [2024-07-24 19:48:25.637991] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:34.119 [2024-07-24 19:48:25.638128] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x213a980 00:12:34.119 [2024-07-24 19:48:25.638141] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:34.119 [2024-07-24 19:48:25.638318] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x213b730 00:12:34.119 [2024-07-24 19:48:25.638467] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x213a980 00:12:34.119 [2024-07-24 19:48:25.638478] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x213a980 00:12:34.119 [2024-07-24 19:48:25.638565] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.119 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.120 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:34.378 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.378 "name": "raid_bdev1", 00:12:34.378 "uuid": "ea443b7a-b2f8-4ba5-961b-c4bd03981188", 00:12:34.378 "strip_size_kb": 64, 00:12:34.378 "state": "online", 00:12:34.378 "raid_level": "concat", 00:12:34.378 "superblock": true, 00:12:34.378 "num_base_bdevs": 2, 00:12:34.378 "num_base_bdevs_discovered": 2, 00:12:34.378 "num_base_bdevs_operational": 2, 00:12:34.378 "base_bdevs_list": [ 00:12:34.378 { 00:12:34.378 "name": "pt1", 00:12:34.378 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:34.378 "is_configured": true, 00:12:34.378 "data_offset": 2048, 00:12:34.378 "data_size": 63488 00:12:34.378 }, 00:12:34.378 { 00:12:34.378 "name": "pt2", 00:12:34.378 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:34.378 "is_configured": true, 00:12:34.378 "data_offset": 2048, 00:12:34.378 "data_size": 63488 00:12:34.378 } 00:12:34.378 ] 00:12:34.378 }' 00:12:34.378 19:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.378 19:48:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.943 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:12:34.943 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:34.943 19:48:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:34.943 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:34.943 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:34.943 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:34.943 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:34.943 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:35.201 [2024-07-24 19:48:26.740091] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:35.201 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:35.201 "name": "raid_bdev1", 00:12:35.201 "aliases": [ 00:12:35.201 "ea443b7a-b2f8-4ba5-961b-c4bd03981188" 00:12:35.201 ], 00:12:35.201 "product_name": "Raid Volume", 00:12:35.201 "block_size": 512, 00:12:35.201 "num_blocks": 126976, 00:12:35.201 "uuid": "ea443b7a-b2f8-4ba5-961b-c4bd03981188", 00:12:35.201 "assigned_rate_limits": { 00:12:35.201 "rw_ios_per_sec": 0, 00:12:35.201 "rw_mbytes_per_sec": 0, 00:12:35.201 "r_mbytes_per_sec": 0, 00:12:35.201 "w_mbytes_per_sec": 0 00:12:35.201 }, 00:12:35.201 "claimed": false, 00:12:35.201 "zoned": false, 00:12:35.201 "supported_io_types": { 00:12:35.201 "read": true, 00:12:35.201 "write": true, 00:12:35.201 "unmap": true, 00:12:35.201 "flush": true, 00:12:35.201 "reset": true, 00:12:35.201 "nvme_admin": false, 00:12:35.201 "nvme_io": false, 00:12:35.201 "nvme_io_md": false, 00:12:35.201 "write_zeroes": true, 00:12:35.201 "zcopy": false, 00:12:35.201 "get_zone_info": false, 00:12:35.201 "zone_management": false, 00:12:35.201 "zone_append": false, 00:12:35.201 "compare": false, 00:12:35.201 "compare_and_write": false, 00:12:35.201 
"abort": false, 00:12:35.201 "seek_hole": false, 00:12:35.201 "seek_data": false, 00:12:35.201 "copy": false, 00:12:35.201 "nvme_iov_md": false 00:12:35.201 }, 00:12:35.201 "memory_domains": [ 00:12:35.201 { 00:12:35.201 "dma_device_id": "system", 00:12:35.201 "dma_device_type": 1 00:12:35.201 }, 00:12:35.201 { 00:12:35.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.201 "dma_device_type": 2 00:12:35.201 }, 00:12:35.201 { 00:12:35.201 "dma_device_id": "system", 00:12:35.201 "dma_device_type": 1 00:12:35.201 }, 00:12:35.201 { 00:12:35.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.201 "dma_device_type": 2 00:12:35.201 } 00:12:35.201 ], 00:12:35.201 "driver_specific": { 00:12:35.201 "raid": { 00:12:35.201 "uuid": "ea443b7a-b2f8-4ba5-961b-c4bd03981188", 00:12:35.201 "strip_size_kb": 64, 00:12:35.201 "state": "online", 00:12:35.201 "raid_level": "concat", 00:12:35.201 "superblock": true, 00:12:35.201 "num_base_bdevs": 2, 00:12:35.201 "num_base_bdevs_discovered": 2, 00:12:35.201 "num_base_bdevs_operational": 2, 00:12:35.201 "base_bdevs_list": [ 00:12:35.201 { 00:12:35.201 "name": "pt1", 00:12:35.201 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:35.201 "is_configured": true, 00:12:35.201 "data_offset": 2048, 00:12:35.201 "data_size": 63488 00:12:35.201 }, 00:12:35.201 { 00:12:35.201 "name": "pt2", 00:12:35.201 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:35.201 "is_configured": true, 00:12:35.201 "data_offset": 2048, 00:12:35.201 "data_size": 63488 00:12:35.201 } 00:12:35.201 ] 00:12:35.201 } 00:12:35.201 } 00:12:35.201 }' 00:12:35.201 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:35.458 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:35.458 pt2' 00:12:35.458 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:35.458 19:48:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:35.458 19:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:35.716 "name": "pt1", 00:12:35.716 "aliases": [ 00:12:35.716 "00000000-0000-0000-0000-000000000001" 00:12:35.716 ], 00:12:35.716 "product_name": "passthru", 00:12:35.716 "block_size": 512, 00:12:35.716 "num_blocks": 65536, 00:12:35.716 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:35.716 "assigned_rate_limits": { 00:12:35.716 "rw_ios_per_sec": 0, 00:12:35.716 "rw_mbytes_per_sec": 0, 00:12:35.716 "r_mbytes_per_sec": 0, 00:12:35.716 "w_mbytes_per_sec": 0 00:12:35.716 }, 00:12:35.716 "claimed": true, 00:12:35.716 "claim_type": "exclusive_write", 00:12:35.716 "zoned": false, 00:12:35.716 "supported_io_types": { 00:12:35.716 "read": true, 00:12:35.716 "write": true, 00:12:35.716 "unmap": true, 00:12:35.716 "flush": true, 00:12:35.716 "reset": true, 00:12:35.716 "nvme_admin": false, 00:12:35.716 "nvme_io": false, 00:12:35.716 "nvme_io_md": false, 00:12:35.716 "write_zeroes": true, 00:12:35.716 "zcopy": true, 00:12:35.716 "get_zone_info": false, 00:12:35.716 "zone_management": false, 00:12:35.716 "zone_append": false, 00:12:35.716 "compare": false, 00:12:35.716 "compare_and_write": false, 00:12:35.716 "abort": true, 00:12:35.716 "seek_hole": false, 00:12:35.716 "seek_data": false, 00:12:35.716 "copy": true, 00:12:35.716 "nvme_iov_md": false 00:12:35.716 }, 00:12:35.716 "memory_domains": [ 00:12:35.716 { 00:12:35.716 "dma_device_id": "system", 00:12:35.716 "dma_device_type": 1 00:12:35.716 }, 00:12:35.716 { 00:12:35.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.716 "dma_device_type": 2 00:12:35.716 } 00:12:35.716 ], 00:12:35.716 "driver_specific": { 00:12:35.716 "passthru": { 00:12:35.716 
"name": "pt1", 00:12:35.716 "base_bdev_name": "malloc1" 00:12:35.716 } 00:12:35.716 } 00:12:35.716 }' 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.716 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:35.983 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:35.983 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.983 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:35.983 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:35.983 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:35.983 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:35.983 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:36.240 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:36.240 "name": "pt2", 00:12:36.240 "aliases": [ 00:12:36.241 "00000000-0000-0000-0000-000000000002" 00:12:36.241 ], 00:12:36.241 "product_name": "passthru", 00:12:36.241 "block_size": 512, 00:12:36.241 
"num_blocks": 65536, 00:12:36.241 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:36.241 "assigned_rate_limits": { 00:12:36.241 "rw_ios_per_sec": 0, 00:12:36.241 "rw_mbytes_per_sec": 0, 00:12:36.241 "r_mbytes_per_sec": 0, 00:12:36.241 "w_mbytes_per_sec": 0 00:12:36.241 }, 00:12:36.241 "claimed": true, 00:12:36.241 "claim_type": "exclusive_write", 00:12:36.241 "zoned": false, 00:12:36.241 "supported_io_types": { 00:12:36.241 "read": true, 00:12:36.241 "write": true, 00:12:36.241 "unmap": true, 00:12:36.241 "flush": true, 00:12:36.241 "reset": true, 00:12:36.241 "nvme_admin": false, 00:12:36.241 "nvme_io": false, 00:12:36.241 "nvme_io_md": false, 00:12:36.241 "write_zeroes": true, 00:12:36.241 "zcopy": true, 00:12:36.241 "get_zone_info": false, 00:12:36.241 "zone_management": false, 00:12:36.241 "zone_append": false, 00:12:36.241 "compare": false, 00:12:36.241 "compare_and_write": false, 00:12:36.241 "abort": true, 00:12:36.241 "seek_hole": false, 00:12:36.241 "seek_data": false, 00:12:36.241 "copy": true, 00:12:36.241 "nvme_iov_md": false 00:12:36.241 }, 00:12:36.241 "memory_domains": [ 00:12:36.241 { 00:12:36.241 "dma_device_id": "system", 00:12:36.241 "dma_device_type": 1 00:12:36.241 }, 00:12:36.241 { 00:12:36.241 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.241 "dma_device_type": 2 00:12:36.241 } 00:12:36.241 ], 00:12:36.241 "driver_specific": { 00:12:36.241 "passthru": { 00:12:36.241 "name": "pt2", 00:12:36.241 "base_bdev_name": "malloc2" 00:12:36.241 } 00:12:36.241 } 00:12:36.241 }' 00:12:36.241 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.241 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.241 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:36.241 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:36.241 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:12:36.499 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:36.499 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.499 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.499 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:36.499 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.499 19:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.499 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:36.499 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:36.499 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:12:36.795 [2024-07-24 19:48:28.236052] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:36.795 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ea443b7a-b2f8-4ba5-961b-c4bd03981188 00:12:36.795 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ea443b7a-b2f8-4ba5-961b-c4bd03981188 ']' 00:12:36.795 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:37.077 [2024-07-24 19:48:28.480433] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:37.077 [2024-07-24 19:48:28.480453] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:37.077 [2024-07-24 19:48:28.480505] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:37.077 [2024-07-24 
19:48:28.480550] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:37.077 [2024-07-24 19:48:28.480562] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213a980 name raid_bdev1, state offline 00:12:37.077 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.077 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:12:37.335 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:12:37.335 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:12:37.335 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:37.335 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:37.594 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:37.594 19:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:37.594 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:37.594 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:37.852 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:38.111 [2024-07-24 19:48:29.651491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:38.111 [2024-07-24 
19:48:29.652901] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:38.111 [2024-07-24 19:48:29.652958] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:38.111 [2024-07-24 19:48:29.653000] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:38.111 [2024-07-24 19:48:29.653020] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:38.111 [2024-07-24 19:48:29.653030] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f92760 name raid_bdev1, state configuring 00:12:38.111 request: 00:12:38.111 { 00:12:38.111 "name": "raid_bdev1", 00:12:38.111 "raid_level": "concat", 00:12:38.111 "base_bdevs": [ 00:12:38.111 "malloc1", 00:12:38.111 "malloc2" 00:12:38.111 ], 00:12:38.111 "strip_size_kb": 64, 00:12:38.111 "superblock": false, 00:12:38.111 "method": "bdev_raid_create", 00:12:38.111 "req_id": 1 00:12:38.111 } 00:12:38.111 Got JSON-RPC error response 00:12:38.111 response: 00:12:38.111 { 00:12:38.111 "code": -17, 00:12:38.111 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:38.111 } 00:12:38.111 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:12:38.111 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:38.111 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:38.111 19:48:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:38.111 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.111 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:12:38.370 19:48:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:12:38.370 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:12:38.370 19:48:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:38.630 [2024-07-24 19:48:30.144738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:38.630 [2024-07-24 19:48:30.144792] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:38.630 [2024-07-24 19:48:30.144811] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2139460 00:12:38.630 [2024-07-24 19:48:30.144824] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:38.630 [2024-07-24 19:48:30.146467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:38.630 [2024-07-24 19:48:30.146496] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:38.630 [2024-07-24 19:48:30.146566] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:38.630 [2024-07-24 19:48:30.146596] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:38.630 pt1 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.630 
19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.630 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:38.889 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.889 "name": "raid_bdev1", 00:12:38.889 "uuid": "ea443b7a-b2f8-4ba5-961b-c4bd03981188", 00:12:38.889 "strip_size_kb": 64, 00:12:38.889 "state": "configuring", 00:12:38.889 "raid_level": "concat", 00:12:38.889 "superblock": true, 00:12:38.889 "num_base_bdevs": 2, 00:12:38.889 "num_base_bdevs_discovered": 1, 00:12:38.889 "num_base_bdevs_operational": 2, 00:12:38.889 "base_bdevs_list": [ 00:12:38.889 { 00:12:38.889 "name": "pt1", 00:12:38.889 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:38.889 "is_configured": true, 00:12:38.889 "data_offset": 2048, 00:12:38.889 "data_size": 63488 00:12:38.889 }, 00:12:38.889 { 00:12:38.889 "name": null, 00:12:38.889 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:38.889 "is_configured": false, 00:12:38.889 "data_offset": 2048, 00:12:38.889 "data_size": 63488 00:12:38.889 } 00:12:38.889 ] 00:12:38.889 }' 00:12:38.889 19:48:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.889 19:48:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 
-- # set +x 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:39.826 [2024-07-24 19:48:31.279746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:39.826 [2024-07-24 19:48:31.279795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:39.826 [2024-07-24 19:48:31.279814] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213ad20 00:12:39.826 [2024-07-24 19:48:31.279826] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:39.826 [2024-07-24 19:48:31.280178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:39.826 [2024-07-24 19:48:31.280195] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:39.826 [2024-07-24 19:48:31.280258] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:39.826 [2024-07-24 19:48:31.280279] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:39.826 [2024-07-24 19:48:31.280376] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x213ba80 00:12:39.826 [2024-07-24 19:48:31.280386] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:39.826 [2024-07-24 19:48:31.280564] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x213e2d0 00:12:39.826 [2024-07-24 19:48:31.280689] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid 
bdev generic 0x213ba80 00:12:39.826 [2024-07-24 19:48:31.280699] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x213ba80 00:12:39.826 [2024-07-24 19:48:31.280796] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:39.826 pt2 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.826 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:40.085 19:48:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.085 "name": "raid_bdev1", 00:12:40.085 "uuid": "ea443b7a-b2f8-4ba5-961b-c4bd03981188", 00:12:40.085 "strip_size_kb": 64, 00:12:40.085 "state": "online", 00:12:40.085 "raid_level": "concat", 00:12:40.085 "superblock": true, 00:12:40.085 "num_base_bdevs": 2, 00:12:40.085 "num_base_bdevs_discovered": 2, 00:12:40.085 "num_base_bdevs_operational": 2, 00:12:40.085 "base_bdevs_list": [ 00:12:40.085 { 00:12:40.085 "name": "pt1", 00:12:40.085 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:40.085 "is_configured": true, 00:12:40.085 "data_offset": 2048, 00:12:40.085 "data_size": 63488 00:12:40.085 }, 00:12:40.085 { 00:12:40.085 "name": "pt2", 00:12:40.085 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:40.085 "is_configured": true, 00:12:40.085 "data_offset": 2048, 00:12:40.085 "data_size": 63488 00:12:40.085 } 00:12:40.085 ] 00:12:40.085 }' 00:12:40.085 19:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.085 19:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:40.653 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:40.913 [2024-07-24 19:48:32.394949] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:40.913 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:40.913 "name": "raid_bdev1", 00:12:40.913 "aliases": [ 00:12:40.913 "ea443b7a-b2f8-4ba5-961b-c4bd03981188" 00:12:40.913 ], 00:12:40.913 "product_name": "Raid Volume", 00:12:40.913 "block_size": 512, 00:12:40.913 "num_blocks": 126976, 00:12:40.913 "uuid": "ea443b7a-b2f8-4ba5-961b-c4bd03981188", 00:12:40.913 "assigned_rate_limits": { 00:12:40.913 "rw_ios_per_sec": 0, 00:12:40.913 "rw_mbytes_per_sec": 0, 00:12:40.913 "r_mbytes_per_sec": 0, 00:12:40.913 "w_mbytes_per_sec": 0 00:12:40.913 }, 00:12:40.913 "claimed": false, 00:12:40.913 "zoned": false, 00:12:40.913 "supported_io_types": { 00:12:40.913 "read": true, 00:12:40.913 "write": true, 00:12:40.913 "unmap": true, 00:12:40.913 "flush": true, 00:12:40.913 "reset": true, 00:12:40.913 "nvme_admin": false, 00:12:40.913 "nvme_io": false, 00:12:40.913 "nvme_io_md": false, 00:12:40.913 "write_zeroes": true, 00:12:40.913 "zcopy": false, 00:12:40.913 "get_zone_info": false, 00:12:40.913 "zone_management": false, 00:12:40.913 "zone_append": false, 00:12:40.913 "compare": false, 00:12:40.913 "compare_and_write": false, 00:12:40.913 "abort": false, 00:12:40.913 "seek_hole": false, 00:12:40.913 "seek_data": false, 00:12:40.913 "copy": false, 00:12:40.913 "nvme_iov_md": false 00:12:40.913 }, 00:12:40.913 "memory_domains": [ 00:12:40.913 { 00:12:40.913 "dma_device_id": "system", 00:12:40.913 "dma_device_type": 1 00:12:40.913 }, 00:12:40.913 { 00:12:40.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.913 "dma_device_type": 2 00:12:40.913 }, 00:12:40.913 { 00:12:40.913 "dma_device_id": "system", 00:12:40.913 "dma_device_type": 1 00:12:40.913 }, 00:12:40.913 { 00:12:40.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.913 "dma_device_type": 2 
00:12:40.913 } 00:12:40.913 ], 00:12:40.913 "driver_specific": { 00:12:40.913 "raid": { 00:12:40.913 "uuid": "ea443b7a-b2f8-4ba5-961b-c4bd03981188", 00:12:40.913 "strip_size_kb": 64, 00:12:40.913 "state": "online", 00:12:40.913 "raid_level": "concat", 00:12:40.913 "superblock": true, 00:12:40.913 "num_base_bdevs": 2, 00:12:40.913 "num_base_bdevs_discovered": 2, 00:12:40.913 "num_base_bdevs_operational": 2, 00:12:40.913 "base_bdevs_list": [ 00:12:40.913 { 00:12:40.913 "name": "pt1", 00:12:40.913 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:40.913 "is_configured": true, 00:12:40.913 "data_offset": 2048, 00:12:40.913 "data_size": 63488 00:12:40.913 }, 00:12:40.913 { 00:12:40.913 "name": "pt2", 00:12:40.913 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:40.913 "is_configured": true, 00:12:40.913 "data_offset": 2048, 00:12:40.913 "data_size": 63488 00:12:40.913 } 00:12:40.913 ] 00:12:40.913 } 00:12:40.913 } 00:12:40.913 }' 00:12:40.913 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:40.913 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:40.913 pt2' 00:12:40.913 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:40.913 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:40.913 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:41.173 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:41.173 "name": "pt1", 00:12:41.173 "aliases": [ 00:12:41.173 "00000000-0000-0000-0000-000000000001" 00:12:41.173 ], 00:12:41.173 "product_name": "passthru", 00:12:41.173 "block_size": 512, 00:12:41.173 "num_blocks": 65536, 00:12:41.173 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:12:41.173 "assigned_rate_limits": { 00:12:41.173 "rw_ios_per_sec": 0, 00:12:41.173 "rw_mbytes_per_sec": 0, 00:12:41.173 "r_mbytes_per_sec": 0, 00:12:41.173 "w_mbytes_per_sec": 0 00:12:41.173 }, 00:12:41.173 "claimed": true, 00:12:41.173 "claim_type": "exclusive_write", 00:12:41.173 "zoned": false, 00:12:41.173 "supported_io_types": { 00:12:41.173 "read": true, 00:12:41.173 "write": true, 00:12:41.173 "unmap": true, 00:12:41.173 "flush": true, 00:12:41.173 "reset": true, 00:12:41.173 "nvme_admin": false, 00:12:41.173 "nvme_io": false, 00:12:41.173 "nvme_io_md": false, 00:12:41.173 "write_zeroes": true, 00:12:41.173 "zcopy": true, 00:12:41.173 "get_zone_info": false, 00:12:41.173 "zone_management": false, 00:12:41.173 "zone_append": false, 00:12:41.173 "compare": false, 00:12:41.173 "compare_and_write": false, 00:12:41.173 "abort": true, 00:12:41.173 "seek_hole": false, 00:12:41.173 "seek_data": false, 00:12:41.173 "copy": true, 00:12:41.173 "nvme_iov_md": false 00:12:41.173 }, 00:12:41.173 "memory_domains": [ 00:12:41.173 { 00:12:41.173 "dma_device_id": "system", 00:12:41.173 "dma_device_type": 1 00:12:41.173 }, 00:12:41.173 { 00:12:41.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.173 "dma_device_type": 2 00:12:41.173 } 00:12:41.173 ], 00:12:41.173 "driver_specific": { 00:12:41.173 "passthru": { 00:12:41.173 "name": "pt1", 00:12:41.173 "base_bdev_name": "malloc1" 00:12:41.173 } 00:12:41.173 } 00:12:41.173 }' 00:12:41.173 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.173 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.173 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:41.173 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.432 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.432 19:48:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:41.432 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.432 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.432 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.432 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.432 19:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.432 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.432 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:41.432 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:41.432 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:41.691 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:41.692 "name": "pt2", 00:12:41.692 "aliases": [ 00:12:41.692 "00000000-0000-0000-0000-000000000002" 00:12:41.692 ], 00:12:41.692 "product_name": "passthru", 00:12:41.692 "block_size": 512, 00:12:41.692 "num_blocks": 65536, 00:12:41.692 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:41.692 "assigned_rate_limits": { 00:12:41.692 "rw_ios_per_sec": 0, 00:12:41.692 "rw_mbytes_per_sec": 0, 00:12:41.692 "r_mbytes_per_sec": 0, 00:12:41.692 "w_mbytes_per_sec": 0 00:12:41.692 }, 00:12:41.692 "claimed": true, 00:12:41.692 "claim_type": "exclusive_write", 00:12:41.692 "zoned": false, 00:12:41.692 "supported_io_types": { 00:12:41.692 "read": true, 00:12:41.692 "write": true, 00:12:41.692 "unmap": true, 00:12:41.692 "flush": true, 00:12:41.692 "reset": true, 00:12:41.692 "nvme_admin": false, 00:12:41.692 
"nvme_io": false, 00:12:41.692 "nvme_io_md": false, 00:12:41.692 "write_zeroes": true, 00:12:41.692 "zcopy": true, 00:12:41.692 "get_zone_info": false, 00:12:41.692 "zone_management": false, 00:12:41.692 "zone_append": false, 00:12:41.692 "compare": false, 00:12:41.692 "compare_and_write": false, 00:12:41.692 "abort": true, 00:12:41.692 "seek_hole": false, 00:12:41.692 "seek_data": false, 00:12:41.692 "copy": true, 00:12:41.692 "nvme_iov_md": false 00:12:41.692 }, 00:12:41.692 "memory_domains": [ 00:12:41.692 { 00:12:41.692 "dma_device_id": "system", 00:12:41.692 "dma_device_type": 1 00:12:41.692 }, 00:12:41.692 { 00:12:41.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.692 "dma_device_type": 2 00:12:41.692 } 00:12:41.692 ], 00:12:41.692 "driver_specific": { 00:12:41.692 "passthru": { 00:12:41.692 "name": "pt2", 00:12:41.692 "base_bdev_name": "malloc2" 00:12:41.692 } 00:12:41.692 } 00:12:41.692 }' 00:12:41.692 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.958 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.217 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:12:42.217 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.217 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:42.217 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:12:42.477 [2024-07-24 19:48:33.822716] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ea443b7a-b2f8-4ba5-961b-c4bd03981188 '!=' ea443b7a-b2f8-4ba5-961b-c4bd03981188 ']' 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1383521 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1383521 ']' 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1383521 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1383521 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process 
with pid 1383521' 00:12:42.477 killing process with pid 1383521 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1383521 00:12:42.477 [2024-07-24 19:48:33.913196] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:42.477 [2024-07-24 19:48:33.913253] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:42.477 [2024-07-24 19:48:33.913300] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:42.477 [2024-07-24 19:48:33.913311] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213ba80 name raid_bdev1, state offline 00:12:42.477 19:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1383521 00:12:42.477 [2024-07-24 19:48:33.932529] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:42.737 19:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:12:42.737 00:12:42.737 real 0m10.261s 00:12:42.737 user 0m18.258s 00:12:42.737 sys 0m1.937s 00:12:42.737 19:48:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:42.737 19:48:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.737 ************************************ 00:12:42.737 END TEST raid_superblock_test 00:12:42.737 ************************************ 00:12:42.737 19:48:34 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:42.737 19:48:34 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:42.737 19:48:34 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:42.737 19:48:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:42.737 ************************************ 00:12:42.737 START TEST raid_read_error_test 00:12:42.737 ************************************ 00:12:42.737 19:48:34 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 read 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:12:42.737 19:48:34 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.7BiR9CVDmI 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1385153 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1385153 /var/tmp/spdk-raid.sock 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1385153 ']' 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:42.737 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:42.737 19:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.737 [2024-07-24 19:48:34.317505] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:12:42.737 [2024-07-24 19:48:34.317567] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1385153 ] 00:12:42.997 [2024-07-24 19:48:34.431673] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:42.997 [2024-07-24 19:48:34.537316] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:43.256 [2024-07-24 19:48:34.593199] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:43.256 [2024-07-24 19:48:34.593233] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:43.824 19:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:43.824 19:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:43.824 19:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:43.824 19:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:44.083 BaseBdev1_malloc 00:12:44.083 19:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:44.342 true 00:12:44.342 19:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:44.601 [2024-07-24 19:48:35.985558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:44.601 [2024-07-24 19:48:35.985604] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:44.601 [2024-07-24 19:48:35.985622] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd513a0 00:12:44.601 [2024-07-24 19:48:35.985635] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:44.601 [2024-07-24 19:48:35.987179] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:44.601 [2024-07-24 19:48:35.987205] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:44.601 BaseBdev1 00:12:44.601 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:44.601 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:44.861 BaseBdev2_malloc 00:12:44.861 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:45.120 true 00:12:45.120 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:45.120 [2024-07-24 19:48:36.664024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:45.120 [2024-07-24 19:48:36.664068] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:45.120 [2024-07-24 19:48:36.664091] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe10370 00:12:45.120 [2024-07-24 19:48:36.664103] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:45.120 [2024-07-24 19:48:36.665597] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:45.120 [2024-07-24 19:48:36.665625] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:45.120 BaseBdev2 00:12:45.120 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:45.380 [2024-07-24 19:48:36.912710] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:45.380 [2024-07-24 19:48:36.913925] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:45.380 [2024-07-24 19:48:36.914111] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd47340 00:12:45.380 [2024-07-24 19:48:36.914124] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:45.380 [2024-07-24 19:48:36.914308] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd48080 00:12:45.380 [2024-07-24 19:48:36.914462] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd47340 00:12:45.380 [2024-07-24 19:48:36.914472] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd47340 00:12:45.380 [2024-07-24 19:48:36.914571] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.380 19:48:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:45.639 19:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.639 "name": "raid_bdev1", 00:12:45.639 "uuid": "8b4ef9a0-5ce4-4077-ae75-a4283472fb46", 00:12:45.640 "strip_size_kb": 64, 00:12:45.640 "state": "online", 00:12:45.640 "raid_level": "concat", 00:12:45.640 "superblock": true, 00:12:45.640 "num_base_bdevs": 2, 00:12:45.640 "num_base_bdevs_discovered": 2, 00:12:45.640 "num_base_bdevs_operational": 2, 00:12:45.640 "base_bdevs_list": [ 00:12:45.640 { 00:12:45.640 "name": "BaseBdev1", 00:12:45.640 "uuid": "11fc016b-ceb1-5196-a700-cc6f5fecba2e", 00:12:45.640 "is_configured": true, 00:12:45.640 "data_offset": 2048, 00:12:45.640 "data_size": 63488 00:12:45.640 }, 00:12:45.640 { 00:12:45.640 "name": "BaseBdev2", 00:12:45.640 "uuid": "373c6066-3cdc-56e8-ba9b-6a38f9aca623", 00:12:45.640 "is_configured": true, 00:12:45.640 "data_offset": 2048, 00:12:45.640 "data_size": 63488 00:12:45.640 } 00:12:45.640 ] 00:12:45.640 }' 00:12:45.640 19:48:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.640 19:48:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.578 19:48:38 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:46.578 19:48:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:12:46.578 [2024-07-24 19:48:38.152314] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe119c0 00:12:47.516 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.776 19:48:39 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.776 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:48.035 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.035 "name": "raid_bdev1", 00:12:48.035 "uuid": "8b4ef9a0-5ce4-4077-ae75-a4283472fb46", 00:12:48.035 "strip_size_kb": 64, 00:12:48.035 "state": "online", 00:12:48.035 "raid_level": "concat", 00:12:48.035 "superblock": true, 00:12:48.035 "num_base_bdevs": 2, 00:12:48.035 "num_base_bdevs_discovered": 2, 00:12:48.035 "num_base_bdevs_operational": 2, 00:12:48.035 "base_bdevs_list": [ 00:12:48.035 { 00:12:48.035 "name": "BaseBdev1", 00:12:48.035 "uuid": "11fc016b-ceb1-5196-a700-cc6f5fecba2e", 00:12:48.035 "is_configured": true, 00:12:48.035 "data_offset": 2048, 00:12:48.035 "data_size": 63488 00:12:48.035 }, 00:12:48.035 { 00:12:48.035 "name": "BaseBdev2", 00:12:48.035 "uuid": "373c6066-3cdc-56e8-ba9b-6a38f9aca623", 00:12:48.035 "is_configured": true, 00:12:48.035 "data_offset": 2048, 00:12:48.035 "data_size": 63488 00:12:48.035 } 00:12:48.035 ] 00:12:48.035 }' 00:12:48.035 19:48:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.035 19:48:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.604 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:48.863 [2024-07-24 19:48:40.398155] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:48.863 [2024-07-24 19:48:40.398193] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:48.863 [2024-07-24 19:48:40.401356] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:48.863 [2024-07-24 19:48:40.401388] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:48.863 [2024-07-24 19:48:40.401421] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:48.863 [2024-07-24 19:48:40.401432] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd47340 name raid_bdev1, state offline 00:12:48.863 0 00:12:48.863 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1385153 00:12:48.863 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1385153 ']' 00:12:48.863 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1385153 00:12:48.863 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:12:48.863 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:48.863 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1385153 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1385153' 00:12:49.123 killing process with pid 1385153 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1385153 00:12:49.123 [2024-07-24 19:48:40.468424] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1385153 00:12:49.123 [2024-07-24 19:48:40.479079] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.7BiR9CVDmI 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:12:49.123 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:12:49.383 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.45 00:12:49.383 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:12:49.383 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:49.383 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:49.383 19:48:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.45 != \0\.\0\0 ]] 00:12:49.383 00:12:49.383 real 0m6.472s 00:12:49.383 user 0m10.175s 00:12:49.383 sys 0m1.129s 00:12:49.383 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:49.383 19:48:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.383 ************************************ 00:12:49.383 END TEST raid_read_error_test 00:12:49.383 ************************************ 00:12:49.383 19:48:40 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:49.383 19:48:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:49.383 19:48:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:49.383 19:48:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:49.383 ************************************ 00:12:49.383 START TEST raid_write_error_test 00:12:49.383 ************************************ 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 write 00:12:49.383 19:48:40 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:12:49.383 19:48:40 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.UdA4Etav1L 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1386125 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1386125 /var/tmp/spdk-raid.sock 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1386125 ']' 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:49.383 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:49.383 19:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.383 [2024-07-24 19:48:40.877322] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:12:49.383 [2024-07-24 19:48:40.877399] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1386125 ] 00:12:49.643 [2024-07-24 19:48:41.009530] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:49.643 [2024-07-24 19:48:41.115755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:49.643 [2024-07-24 19:48:41.179823] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:49.643 [2024-07-24 19:48:41.179860] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:50.580 19:48:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:50.580 19:48:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:50.580 19:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:50.580 19:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:50.580 BaseBdev1_malloc 00:12:50.580 19:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:50.839 true 00:12:50.839 19:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:51.098 [2024-07-24 19:48:42.522575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:51.098 [2024-07-24 19:48:42.522618] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:51.098 [2024-07-24 19:48:42.522639] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11fd3a0 00:12:51.098 [2024-07-24 19:48:42.522651] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.098 [2024-07-24 19:48:42.524414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.098 [2024-07-24 19:48:42.524442] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:51.098 BaseBdev1 00:12:51.098 19:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:51.098 19:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:51.358 BaseBdev2_malloc 00:12:51.358 19:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:51.617 true 00:12:51.618 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:51.887 [2024-07-24 19:48:43.242253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:51.887 [2024-07-24 19:48:43.242297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.887 [2024-07-24 19:48:43.242321] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12bc370 00:12:51.887 [2024-07-24 19:48:43.242334] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.887 [2024-07-24 19:48:43.243969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.887 [2024-07-24 19:48:43.243997] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:51.887 BaseBdev2 00:12:51.888 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:52.150 [2024-07-24 19:48:43.486931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:52.150 [2024-07-24 19:48:43.488297] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:52.150 [2024-07-24 19:48:43.488509] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11f3340 00:12:52.150 [2024-07-24 19:48:43.488525] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:52.150 [2024-07-24 19:48:43.488726] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f4080 00:12:52.150 [2024-07-24 19:48:43.488875] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11f3340 00:12:52.150 [2024-07-24 19:48:43.488886] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11f3340 00:12:52.150 [2024-07-24 19:48:43.488991] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.150 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:52.409 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.409 "name": "raid_bdev1", 00:12:52.409 "uuid": "af9ff0f4-562c-49ba-aa76-d2544181209c", 00:12:52.409 "strip_size_kb": 64, 00:12:52.409 "state": "online", 00:12:52.409 "raid_level": "concat", 00:12:52.409 "superblock": true, 00:12:52.409 "num_base_bdevs": 2, 00:12:52.409 "num_base_bdevs_discovered": 2, 00:12:52.409 "num_base_bdevs_operational": 2, 00:12:52.409 "base_bdevs_list": [ 00:12:52.409 { 00:12:52.409 "name": "BaseBdev1", 00:12:52.409 "uuid": "6fa02e4c-f086-5411-867d-2e3086769ab6", 00:12:52.409 "is_configured": true, 00:12:52.409 "data_offset": 2048, 00:12:52.409 "data_size": 63488 00:12:52.409 }, 00:12:52.409 { 00:12:52.409 "name": "BaseBdev2", 00:12:52.409 "uuid": "1cb155c3-9401-5875-af55-2b6adf65ffdc", 00:12:52.409 "is_configured": true, 00:12:52.409 "data_offset": 2048, 00:12:52.409 "data_size": 63488 00:12:52.409 } 00:12:52.409 ] 00:12:52.409 }' 00:12:52.409 19:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.409 19:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.977 
19:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:12:52.978 19:48:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:52.978 [2024-07-24 19:48:44.373595] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12bd9c0 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.916 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:54.175 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.175 "name": "raid_bdev1", 00:12:54.175 "uuid": "af9ff0f4-562c-49ba-aa76-d2544181209c", 00:12:54.175 "strip_size_kb": 64, 00:12:54.175 "state": "online", 00:12:54.175 "raid_level": "concat", 00:12:54.175 "superblock": true, 00:12:54.175 "num_base_bdevs": 2, 00:12:54.175 "num_base_bdevs_discovered": 2, 00:12:54.175 "num_base_bdevs_operational": 2, 00:12:54.175 "base_bdevs_list": [ 00:12:54.175 { 00:12:54.175 "name": "BaseBdev1", 00:12:54.175 "uuid": "6fa02e4c-f086-5411-867d-2e3086769ab6", 00:12:54.175 "is_configured": true, 00:12:54.175 "data_offset": 2048, 00:12:54.175 "data_size": 63488 00:12:54.175 }, 00:12:54.175 { 00:12:54.175 "name": "BaseBdev2", 00:12:54.175 "uuid": "1cb155c3-9401-5875-af55-2b6adf65ffdc", 00:12:54.175 "is_configured": true, 00:12:54.175 "data_offset": 2048, 00:12:54.175 "data_size": 63488 00:12:54.175 } 00:12:54.175 ] 00:12:54.175 }' 00:12:54.175 19:48:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.175 19:48:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.744 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:55.003 [2024-07-24 19:48:46.456846] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:55.003 [2024-07-24 19:48:46.456885] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:12:55.003 [2024-07-24 19:48:46.460058] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:55.003 [2024-07-24 19:48:46.460088] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:55.003 [2024-07-24 19:48:46.460114] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:55.003 [2024-07-24 19:48:46.460125] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f3340 name raid_bdev1, state offline 00:12:55.003 0 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1386125 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1386125 ']' 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1386125 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1386125 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:55.003 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1386125' 00:12:55.003 killing process with pid 1386125 00:12:55.004 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1386125 00:12:55.004 [2024-07-24 19:48:46.541775] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:55.004 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1386125 
00:12:55.004 [2024-07-24 19:48:46.552869] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.UdA4Etav1L 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.48 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.48 != \0\.\0\0 ]] 00:12:55.265 00:12:55.265 real 0m5.997s 00:12:55.265 user 0m9.213s 00:12:55.265 sys 0m1.120s 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:55.265 19:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.265 ************************************ 00:12:55.265 END TEST raid_write_error_test 00:12:55.265 ************************************ 00:12:55.266 19:48:46 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:12:55.266 19:48:46 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:55.266 19:48:46 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:55.266 19:48:46 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:55.266 19:48:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:55.526 ************************************ 00:12:55.526 START TEST raid_state_function_test 00:12:55.526 ************************************ 
00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 false 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:55.526 19:48:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1386930 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1386930' 00:12:55.526 Process raid pid: 1386930 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1386930 /var/tmp/spdk-raid.sock 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1386930 ']' 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:55.526 19:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:55.527 19:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:55.527 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:55.527 19:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:55.527 19:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.527 [2024-07-24 19:48:46.950423] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:12:55.527 [2024-07-24 19:48:46.950489] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:55.527 [2024-07-24 19:48:47.081569] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:55.786 [2024-07-24 19:48:47.188901] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:55.786 [2024-07-24 19:48:47.256444] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.786 [2024-07-24 19:48:47.256478] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:56.453 19:48:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:56.453 19:48:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:12:56.453 19:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:56.713 [2024-07-24 19:48:48.115952] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:56.713 [2024-07-24 19:48:48.115991] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:56.713 [2024-07-24 19:48:48.116001] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:56.713 [2024-07-24 19:48:48.116013] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.713 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.972 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.972 "name": "Existed_Raid", 00:12:56.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.972 "strip_size_kb": 0, 00:12:56.972 "state": "configuring", 00:12:56.972 "raid_level": "raid1", 00:12:56.972 "superblock": false, 00:12:56.972 "num_base_bdevs": 2, 00:12:56.972 "num_base_bdevs_discovered": 0, 00:12:56.972 "num_base_bdevs_operational": 2, 
00:12:56.972 "base_bdevs_list": [ 00:12:56.972 { 00:12:56.972 "name": "BaseBdev1", 00:12:56.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.972 "is_configured": false, 00:12:56.972 "data_offset": 0, 00:12:56.972 "data_size": 0 00:12:56.972 }, 00:12:56.972 { 00:12:56.972 "name": "BaseBdev2", 00:12:56.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.972 "is_configured": false, 00:12:56.972 "data_offset": 0, 00:12:56.972 "data_size": 0 00:12:56.972 } 00:12:56.972 ] 00:12:56.972 }' 00:12:56.972 19:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.972 19:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:57.553 19:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:57.813 [2024-07-24 19:48:49.274903] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:57.813 [2024-07-24 19:48:49.274932] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf4f9f0 name Existed_Raid, state configuring 00:12:57.813 19:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:58.073 [2024-07-24 19:48:49.519558] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:58.073 [2024-07-24 19:48:49.519588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:58.073 [2024-07-24 19:48:49.519598] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:58.073 [2024-07-24 19:48:49.519610] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:58.073 19:48:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:58.332 [2024-07-24 19:48:49.774075] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:58.332 BaseBdev1 00:12:58.332 19:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:58.332 19:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:58.332 19:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:58.332 19:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:58.332 19:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:58.332 19:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:58.332 19:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:58.591 19:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:58.850 [ 00:12:58.850 { 00:12:58.850 "name": "BaseBdev1", 00:12:58.850 "aliases": [ 00:12:58.850 "86be286f-492e-4399-8908-2ce1c290125c" 00:12:58.850 ], 00:12:58.850 "product_name": "Malloc disk", 00:12:58.850 "block_size": 512, 00:12:58.850 "num_blocks": 65536, 00:12:58.850 "uuid": "86be286f-492e-4399-8908-2ce1c290125c", 00:12:58.850 "assigned_rate_limits": { 00:12:58.850 "rw_ios_per_sec": 0, 00:12:58.851 "rw_mbytes_per_sec": 0, 00:12:58.851 "r_mbytes_per_sec": 0, 00:12:58.851 "w_mbytes_per_sec": 0 00:12:58.851 }, 00:12:58.851 "claimed": true, 
00:12:58.851 "claim_type": "exclusive_write", 00:12:58.851 "zoned": false, 00:12:58.851 "supported_io_types": { 00:12:58.851 "read": true, 00:12:58.851 "write": true, 00:12:58.851 "unmap": true, 00:12:58.851 "flush": true, 00:12:58.851 "reset": true, 00:12:58.851 "nvme_admin": false, 00:12:58.851 "nvme_io": false, 00:12:58.851 "nvme_io_md": false, 00:12:58.851 "write_zeroes": true, 00:12:58.851 "zcopy": true, 00:12:58.851 "get_zone_info": false, 00:12:58.851 "zone_management": false, 00:12:58.851 "zone_append": false, 00:12:58.851 "compare": false, 00:12:58.851 "compare_and_write": false, 00:12:58.851 "abort": true, 00:12:58.851 "seek_hole": false, 00:12:58.851 "seek_data": false, 00:12:58.851 "copy": true, 00:12:58.851 "nvme_iov_md": false 00:12:58.851 }, 00:12:58.851 "memory_domains": [ 00:12:58.851 { 00:12:58.851 "dma_device_id": "system", 00:12:58.851 "dma_device_type": 1 00:12:58.851 }, 00:12:58.851 { 00:12:58.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.851 "dma_device_type": 2 00:12:58.851 } 00:12:58.851 ], 00:12:58.851 "driver_specific": {} 00:12:58.851 } 00:12:58.851 ] 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.851 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:59.110 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.110 "name": "Existed_Raid", 00:12:59.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.110 "strip_size_kb": 0, 00:12:59.110 "state": "configuring", 00:12:59.110 "raid_level": "raid1", 00:12:59.110 "superblock": false, 00:12:59.110 "num_base_bdevs": 2, 00:12:59.110 "num_base_bdevs_discovered": 1, 00:12:59.110 "num_base_bdevs_operational": 2, 00:12:59.110 "base_bdevs_list": [ 00:12:59.110 { 00:12:59.110 "name": "BaseBdev1", 00:12:59.110 "uuid": "86be286f-492e-4399-8908-2ce1c290125c", 00:12:59.110 "is_configured": true, 00:12:59.110 "data_offset": 0, 00:12:59.110 "data_size": 65536 00:12:59.110 }, 00:12:59.110 { 00:12:59.110 "name": "BaseBdev2", 00:12:59.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:59.110 "is_configured": false, 00:12:59.110 "data_offset": 0, 00:12:59.110 "data_size": 0 00:12:59.110 } 00:12:59.110 ] 00:12:59.110 }' 00:12:59.110 19:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.110 19:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.679 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:59.938 [2024-07-24 19:48:51.350258] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:59.938 [2024-07-24 19:48:51.350297] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf4f2e0 name Existed_Raid, state configuring 00:12:59.938 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:00.197 [2024-07-24 19:48:51.538788] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:00.197 [2024-07-24 19:48:51.540257] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:00.197 [2024-07-24 19:48:51.540289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:00.197 19:48:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.197 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.456 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.456 "name": "Existed_Raid", 00:13:00.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.456 "strip_size_kb": 0, 00:13:00.456 "state": "configuring", 00:13:00.456 "raid_level": "raid1", 00:13:00.457 "superblock": false, 00:13:00.457 "num_base_bdevs": 2, 00:13:00.457 "num_base_bdevs_discovered": 1, 00:13:00.457 "num_base_bdevs_operational": 2, 00:13:00.457 "base_bdevs_list": [ 00:13:00.457 { 00:13:00.457 "name": "BaseBdev1", 00:13:00.457 "uuid": "86be286f-492e-4399-8908-2ce1c290125c", 00:13:00.457 "is_configured": true, 00:13:00.457 "data_offset": 0, 00:13:00.457 "data_size": 65536 00:13:00.457 }, 00:13:00.457 { 00:13:00.457 "name": "BaseBdev2", 00:13:00.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.457 "is_configured": false, 00:13:00.457 "data_offset": 0, 00:13:00.457 "data_size": 0 00:13:00.457 } 00:13:00.457 ] 00:13:00.457 }' 00:13:00.457 19:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.457 19:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:01.025 [2024-07-24 19:48:52.552809] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:01.025 [2024-07-24 19:48:52.552846] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf500d0 00:13:01.025 [2024-07-24 19:48:52.552854] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:01.025 [2024-07-24 19:48:52.553043] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10f3bd0 00:13:01.025 [2024-07-24 19:48:52.553171] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf500d0 00:13:01.025 [2024-07-24 19:48:52.553181] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf500d0 00:13:01.025 [2024-07-24 19:48:52.553345] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:01.025 BaseBdev2 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:01.025 19:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:01.285 19:48:52 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:01.544 [ 00:13:01.544 { 00:13:01.544 "name": "BaseBdev2", 00:13:01.544 "aliases": [ 00:13:01.544 "18b157ca-6b4e-4d68-8660-9ebd32fbeb4e" 00:13:01.544 ], 00:13:01.544 "product_name": "Malloc disk", 00:13:01.544 "block_size": 512, 00:13:01.544 "num_blocks": 65536, 00:13:01.544 "uuid": "18b157ca-6b4e-4d68-8660-9ebd32fbeb4e", 00:13:01.544 "assigned_rate_limits": { 00:13:01.544 "rw_ios_per_sec": 0, 00:13:01.544 "rw_mbytes_per_sec": 0, 00:13:01.544 "r_mbytes_per_sec": 0, 00:13:01.544 "w_mbytes_per_sec": 0 00:13:01.544 }, 00:13:01.544 "claimed": true, 00:13:01.544 "claim_type": "exclusive_write", 00:13:01.544 "zoned": false, 00:13:01.544 "supported_io_types": { 00:13:01.544 "read": true, 00:13:01.544 "write": true, 00:13:01.544 "unmap": true, 00:13:01.544 "flush": true, 00:13:01.544 "reset": true, 00:13:01.544 "nvme_admin": false, 00:13:01.544 "nvme_io": false, 00:13:01.544 "nvme_io_md": false, 00:13:01.545 "write_zeroes": true, 00:13:01.545 "zcopy": true, 00:13:01.545 "get_zone_info": false, 00:13:01.545 "zone_management": false, 00:13:01.545 "zone_append": false, 00:13:01.545 "compare": false, 00:13:01.545 "compare_and_write": false, 00:13:01.545 "abort": true, 00:13:01.545 "seek_hole": false, 00:13:01.545 "seek_data": false, 00:13:01.545 "copy": true, 00:13:01.545 "nvme_iov_md": false 00:13:01.545 }, 00:13:01.545 "memory_domains": [ 00:13:01.545 { 00:13:01.545 "dma_device_id": "system", 00:13:01.545 "dma_device_type": 1 00:13:01.545 }, 00:13:01.545 { 00:13:01.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.545 "dma_device_type": 2 00:13:01.545 } 00:13:01.545 ], 00:13:01.545 "driver_specific": {} 00:13:01.545 } 00:13:01.545 ] 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.545 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.804 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.804 "name": "Existed_Raid", 00:13:01.804 "uuid": "d0b40229-1d33-4633-ab2a-cf2774c75723", 00:13:01.804 "strip_size_kb": 0, 00:13:01.804 "state": "online", 00:13:01.804 "raid_level": "raid1", 00:13:01.804 "superblock": false, 00:13:01.804 "num_base_bdevs": 
2, 00:13:01.804 "num_base_bdevs_discovered": 2, 00:13:01.804 "num_base_bdevs_operational": 2, 00:13:01.804 "base_bdevs_list": [ 00:13:01.804 { 00:13:01.804 "name": "BaseBdev1", 00:13:01.804 "uuid": "86be286f-492e-4399-8908-2ce1c290125c", 00:13:01.804 "is_configured": true, 00:13:01.804 "data_offset": 0, 00:13:01.804 "data_size": 65536 00:13:01.804 }, 00:13:01.804 { 00:13:01.804 "name": "BaseBdev2", 00:13:01.804 "uuid": "18b157ca-6b4e-4d68-8660-9ebd32fbeb4e", 00:13:01.804 "is_configured": true, 00:13:01.804 "data_offset": 0, 00:13:01.804 "data_size": 65536 00:13:01.804 } 00:13:01.804 ] 00:13:01.804 }' 00:13:01.804 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.804 19:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:02.372 19:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:02.631 [2024-07-24 19:48:54.069094] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:02.631 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:13:02.631 "name": "Existed_Raid", 00:13:02.631 "aliases": [ 00:13:02.631 "d0b40229-1d33-4633-ab2a-cf2774c75723" 00:13:02.631 ], 00:13:02.631 "product_name": "Raid Volume", 00:13:02.631 "block_size": 512, 00:13:02.631 "num_blocks": 65536, 00:13:02.631 "uuid": "d0b40229-1d33-4633-ab2a-cf2774c75723", 00:13:02.631 "assigned_rate_limits": { 00:13:02.631 "rw_ios_per_sec": 0, 00:13:02.631 "rw_mbytes_per_sec": 0, 00:13:02.631 "r_mbytes_per_sec": 0, 00:13:02.631 "w_mbytes_per_sec": 0 00:13:02.632 }, 00:13:02.632 "claimed": false, 00:13:02.632 "zoned": false, 00:13:02.632 "supported_io_types": { 00:13:02.632 "read": true, 00:13:02.632 "write": true, 00:13:02.632 "unmap": false, 00:13:02.632 "flush": false, 00:13:02.632 "reset": true, 00:13:02.632 "nvme_admin": false, 00:13:02.632 "nvme_io": false, 00:13:02.632 "nvme_io_md": false, 00:13:02.632 "write_zeroes": true, 00:13:02.632 "zcopy": false, 00:13:02.632 "get_zone_info": false, 00:13:02.632 "zone_management": false, 00:13:02.632 "zone_append": false, 00:13:02.632 "compare": false, 00:13:02.632 "compare_and_write": false, 00:13:02.632 "abort": false, 00:13:02.632 "seek_hole": false, 00:13:02.632 "seek_data": false, 00:13:02.632 "copy": false, 00:13:02.632 "nvme_iov_md": false 00:13:02.632 }, 00:13:02.632 "memory_domains": [ 00:13:02.632 { 00:13:02.632 "dma_device_id": "system", 00:13:02.632 "dma_device_type": 1 00:13:02.632 }, 00:13:02.632 { 00:13:02.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.632 "dma_device_type": 2 00:13:02.632 }, 00:13:02.632 { 00:13:02.632 "dma_device_id": "system", 00:13:02.632 "dma_device_type": 1 00:13:02.632 }, 00:13:02.632 { 00:13:02.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.632 "dma_device_type": 2 00:13:02.632 } 00:13:02.632 ], 00:13:02.632 "driver_specific": { 00:13:02.632 "raid": { 00:13:02.632 "uuid": "d0b40229-1d33-4633-ab2a-cf2774c75723", 00:13:02.632 "strip_size_kb": 0, 00:13:02.632 "state": "online", 00:13:02.632 "raid_level": "raid1", 
00:13:02.632 "superblock": false, 00:13:02.632 "num_base_bdevs": 2, 00:13:02.632 "num_base_bdevs_discovered": 2, 00:13:02.632 "num_base_bdevs_operational": 2, 00:13:02.632 "base_bdevs_list": [ 00:13:02.632 { 00:13:02.632 "name": "BaseBdev1", 00:13:02.632 "uuid": "86be286f-492e-4399-8908-2ce1c290125c", 00:13:02.632 "is_configured": true, 00:13:02.632 "data_offset": 0, 00:13:02.632 "data_size": 65536 00:13:02.632 }, 00:13:02.632 { 00:13:02.632 "name": "BaseBdev2", 00:13:02.632 "uuid": "18b157ca-6b4e-4d68-8660-9ebd32fbeb4e", 00:13:02.632 "is_configured": true, 00:13:02.632 "data_offset": 0, 00:13:02.632 "data_size": 65536 00:13:02.632 } 00:13:02.632 ] 00:13:02.632 } 00:13:02.632 } 00:13:02.632 }' 00:13:02.632 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:02.632 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:02.632 BaseBdev2' 00:13:02.632 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.632 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.632 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:02.891 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:02.891 "name": "BaseBdev1", 00:13:02.891 "aliases": [ 00:13:02.891 "86be286f-492e-4399-8908-2ce1c290125c" 00:13:02.891 ], 00:13:02.891 "product_name": "Malloc disk", 00:13:02.891 "block_size": 512, 00:13:02.891 "num_blocks": 65536, 00:13:02.891 "uuid": "86be286f-492e-4399-8908-2ce1c290125c", 00:13:02.891 "assigned_rate_limits": { 00:13:02.891 "rw_ios_per_sec": 0, 00:13:02.891 "rw_mbytes_per_sec": 0, 00:13:02.891 "r_mbytes_per_sec": 0, 00:13:02.891 
"w_mbytes_per_sec": 0 00:13:02.891 }, 00:13:02.891 "claimed": true, 00:13:02.891 "claim_type": "exclusive_write", 00:13:02.891 "zoned": false, 00:13:02.891 "supported_io_types": { 00:13:02.891 "read": true, 00:13:02.891 "write": true, 00:13:02.891 "unmap": true, 00:13:02.891 "flush": true, 00:13:02.891 "reset": true, 00:13:02.891 "nvme_admin": false, 00:13:02.891 "nvme_io": false, 00:13:02.891 "nvme_io_md": false, 00:13:02.891 "write_zeroes": true, 00:13:02.891 "zcopy": true, 00:13:02.891 "get_zone_info": false, 00:13:02.891 "zone_management": false, 00:13:02.891 "zone_append": false, 00:13:02.891 "compare": false, 00:13:02.891 "compare_and_write": false, 00:13:02.891 "abort": true, 00:13:02.891 "seek_hole": false, 00:13:02.891 "seek_data": false, 00:13:02.891 "copy": true, 00:13:02.891 "nvme_iov_md": false 00:13:02.891 }, 00:13:02.891 "memory_domains": [ 00:13:02.891 { 00:13:02.891 "dma_device_id": "system", 00:13:02.891 "dma_device_type": 1 00:13:02.891 }, 00:13:02.891 { 00:13:02.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.891 "dma_device_type": 2 00:13:02.891 } 00:13:02.891 ], 00:13:02.891 "driver_specific": {} 00:13:02.891 }' 00:13:02.892 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.892 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.151 
19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:03.151 19:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:03.720 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.720 "name": "BaseBdev2", 00:13:03.720 "aliases": [ 00:13:03.720 "18b157ca-6b4e-4d68-8660-9ebd32fbeb4e" 00:13:03.720 ], 00:13:03.720 "product_name": "Malloc disk", 00:13:03.720 "block_size": 512, 00:13:03.720 "num_blocks": 65536, 00:13:03.720 "uuid": "18b157ca-6b4e-4d68-8660-9ebd32fbeb4e", 00:13:03.720 "assigned_rate_limits": { 00:13:03.720 "rw_ios_per_sec": 0, 00:13:03.720 "rw_mbytes_per_sec": 0, 00:13:03.720 "r_mbytes_per_sec": 0, 00:13:03.720 "w_mbytes_per_sec": 0 00:13:03.720 }, 00:13:03.720 "claimed": true, 00:13:03.720 "claim_type": "exclusive_write", 00:13:03.720 "zoned": false, 00:13:03.720 "supported_io_types": { 00:13:03.720 "read": true, 00:13:03.720 "write": true, 00:13:03.720 "unmap": true, 00:13:03.720 "flush": true, 00:13:03.720 "reset": true, 00:13:03.720 "nvme_admin": false, 00:13:03.720 "nvme_io": false, 00:13:03.720 "nvme_io_md": false, 00:13:03.720 "write_zeroes": true, 00:13:03.720 "zcopy": true, 00:13:03.720 "get_zone_info": false, 00:13:03.720 "zone_management": false, 00:13:03.720 "zone_append": false, 00:13:03.720 "compare": 
false, 00:13:03.720 "compare_and_write": false, 00:13:03.720 "abort": true, 00:13:03.720 "seek_hole": false, 00:13:03.720 "seek_data": false, 00:13:03.720 "copy": true, 00:13:03.720 "nvme_iov_md": false 00:13:03.720 }, 00:13:03.720 "memory_domains": [ 00:13:03.720 { 00:13:03.720 "dma_device_id": "system", 00:13:03.720 "dma_device_type": 1 00:13:03.720 }, 00:13:03.720 { 00:13:03.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.720 "dma_device_type": 2 00:13:03.720 } 00:13:03.720 ], 00:13:03.720 "driver_specific": {} 00:13:03.720 }' 00:13:03.720 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.720 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.979 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.239 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.239 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:04.239 
[2024-07-24 19:48:55.825551] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.498 19:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.498 19:48:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:04.758 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.758 "name": "Existed_Raid", 00:13:04.758 "uuid": "d0b40229-1d33-4633-ab2a-cf2774c75723", 00:13:04.758 "strip_size_kb": 0, 00:13:04.758 "state": "online", 00:13:04.758 "raid_level": "raid1", 00:13:04.758 "superblock": false, 00:13:04.758 "num_base_bdevs": 2, 00:13:04.758 "num_base_bdevs_discovered": 1, 00:13:04.758 "num_base_bdevs_operational": 1, 00:13:04.758 "base_bdevs_list": [ 00:13:04.758 { 00:13:04.758 "name": null, 00:13:04.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.758 "is_configured": false, 00:13:04.758 "data_offset": 0, 00:13:04.758 "data_size": 65536 00:13:04.758 }, 00:13:04.758 { 00:13:04.758 "name": "BaseBdev2", 00:13:04.758 "uuid": "18b157ca-6b4e-4d68-8660-9ebd32fbeb4e", 00:13:04.758 "is_configured": true, 00:13:04.758 "data_offset": 0, 00:13:04.758 "data_size": 65536 00:13:04.758 } 00:13:04.758 ] 00:13:04.758 }' 00:13:04.758 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.758 19:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.326 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:05.326 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:05.326 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.326 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:05.585 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:05.585 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:13:05.585 19:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:05.585 [2024-07-24 19:48:57.170151] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:05.585 [2024-07-24 19:48:57.170229] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:05.844 [2024-07-24 19:48:57.182948] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:05.844 [2024-07-24 19:48:57.182984] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:05.844 [2024-07-24 19:48:57.182996] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf500d0 name Existed_Raid, state offline 00:13:05.844 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:05.844 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:05.844 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.844 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1386930 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1386930 ']' 00:13:06.102 19:48:57 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1386930 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1386930 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1386930' 00:13:06.102 killing process with pid 1386930 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1386930 00:13:06.102 [2024-07-24 19:48:57.495638] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:06.102 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1386930 00:13:06.102 [2024-07-24 19:48:57.496610] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:06.361 00:13:06.361 real 0m10.843s 00:13:06.361 user 0m19.309s 00:13:06.361 sys 0m1.973s 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.361 ************************************ 00:13:06.361 END TEST raid_state_function_test 00:13:06.361 ************************************ 00:13:06.361 19:48:57 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:13:06.361 19:48:57 bdev_raid 
-- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:06.361 19:48:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:06.361 19:48:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:06.361 ************************************ 00:13:06.361 START TEST raid_state_function_test_sb 00:13:06.361 ************************************ 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1388570 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1388570' 00:13:06.361 Process raid pid: 1388570 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1388570 /var/tmp/spdk-raid.sock 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1388570 ']' 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:06.361 19:48:57 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:06.361 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:06.361 19:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:06.361 [2024-07-24 19:48:57.872645] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:13:06.361 [2024-07-24 19:48:57.872711] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:06.620 [2024-07-24 19:48:57.991421] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:06.620 [2024-07-24 19:48:58.096287] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:06.620 [2024-07-24 19:48:58.159342] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:06.620 [2024-07-24 19:48:58.159375] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.557 19:48:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:07.557 19:48:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:13:07.557 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:07.816 [2024-07-24 19:48:59.309822] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:07.816 [2024-07-24 19:48:59.309862] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:07.816 [2024-07-24 19:48:59.309873] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:07.816 [2024-07-24 19:48:59.309885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.816 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.075 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.075 "name": 
"Existed_Raid", 00:13:08.075 "uuid": "e837c5ca-2574-4b83-985c-4abcca0b536b", 00:13:08.075 "strip_size_kb": 0, 00:13:08.075 "state": "configuring", 00:13:08.075 "raid_level": "raid1", 00:13:08.075 "superblock": true, 00:13:08.075 "num_base_bdevs": 2, 00:13:08.075 "num_base_bdevs_discovered": 0, 00:13:08.075 "num_base_bdevs_operational": 2, 00:13:08.075 "base_bdevs_list": [ 00:13:08.075 { 00:13:08.075 "name": "BaseBdev1", 00:13:08.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.075 "is_configured": false, 00:13:08.075 "data_offset": 0, 00:13:08.075 "data_size": 0 00:13:08.075 }, 00:13:08.075 { 00:13:08.075 "name": "BaseBdev2", 00:13:08.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.075 "is_configured": false, 00:13:08.075 "data_offset": 0, 00:13:08.076 "data_size": 0 00:13:08.076 } 00:13:08.076 ] 00:13:08.076 }' 00:13:08.076 19:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.076 19:48:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:08.644 19:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:08.902 [2024-07-24 19:49:00.412611] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:08.902 [2024-07-24 19:49:00.412641] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25929f0 name Existed_Raid, state configuring 00:13:08.902 19:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:09.161 [2024-07-24 19:49:00.661277] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:09.161 [2024-07-24 19:49:00.661303] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:09.161 [2024-07-24 19:49:00.661313] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:09.161 [2024-07-24 19:49:00.661324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:09.161 19:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:09.420 [2024-07-24 19:49:00.919811] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:09.420 BaseBdev1 00:13:09.420 19:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:09.420 19:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:09.420 19:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:09.420 19:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:09.420 19:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:09.420 19:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:09.420 19:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:09.679 19:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:09.937 [ 00:13:09.937 { 00:13:09.937 "name": "BaseBdev1", 00:13:09.937 "aliases": [ 00:13:09.937 "4de3317c-640a-4ca5-bf34-8dee8f6f29a3" 00:13:09.937 ], 00:13:09.937 
"product_name": "Malloc disk", 00:13:09.937 "block_size": 512, 00:13:09.937 "num_blocks": 65536, 00:13:09.937 "uuid": "4de3317c-640a-4ca5-bf34-8dee8f6f29a3", 00:13:09.937 "assigned_rate_limits": { 00:13:09.937 "rw_ios_per_sec": 0, 00:13:09.937 "rw_mbytes_per_sec": 0, 00:13:09.937 "r_mbytes_per_sec": 0, 00:13:09.937 "w_mbytes_per_sec": 0 00:13:09.937 }, 00:13:09.937 "claimed": true, 00:13:09.937 "claim_type": "exclusive_write", 00:13:09.937 "zoned": false, 00:13:09.937 "supported_io_types": { 00:13:09.937 "read": true, 00:13:09.937 "write": true, 00:13:09.937 "unmap": true, 00:13:09.937 "flush": true, 00:13:09.937 "reset": true, 00:13:09.937 "nvme_admin": false, 00:13:09.937 "nvme_io": false, 00:13:09.937 "nvme_io_md": false, 00:13:09.937 "write_zeroes": true, 00:13:09.937 "zcopy": true, 00:13:09.937 "get_zone_info": false, 00:13:09.937 "zone_management": false, 00:13:09.937 "zone_append": false, 00:13:09.937 "compare": false, 00:13:09.937 "compare_and_write": false, 00:13:09.937 "abort": true, 00:13:09.937 "seek_hole": false, 00:13:09.937 "seek_data": false, 00:13:09.937 "copy": true, 00:13:09.937 "nvme_iov_md": false 00:13:09.937 }, 00:13:09.937 "memory_domains": [ 00:13:09.937 { 00:13:09.937 "dma_device_id": "system", 00:13:09.937 "dma_device_type": 1 00:13:09.937 }, 00:13:09.937 { 00:13:09.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.937 "dma_device_type": 2 00:13:09.937 } 00:13:09.937 ], 00:13:09.937 "driver_specific": {} 00:13:09.937 } 00:13:09.937 ] 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.937 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.196 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.196 "name": "Existed_Raid", 00:13:10.196 "uuid": "0185addc-229b-4aa3-97d6-3a47b6e8915f", 00:13:10.196 "strip_size_kb": 0, 00:13:10.196 "state": "configuring", 00:13:10.196 "raid_level": "raid1", 00:13:10.196 "superblock": true, 00:13:10.196 "num_base_bdevs": 2, 00:13:10.196 "num_base_bdevs_discovered": 1, 00:13:10.196 "num_base_bdevs_operational": 2, 00:13:10.196 "base_bdevs_list": [ 00:13:10.196 { 00:13:10.196 "name": "BaseBdev1", 00:13:10.196 "uuid": "4de3317c-640a-4ca5-bf34-8dee8f6f29a3", 00:13:10.196 "is_configured": true, 00:13:10.196 "data_offset": 2048, 00:13:10.196 "data_size": 63488 00:13:10.196 }, 00:13:10.196 { 00:13:10.196 "name": "BaseBdev2", 00:13:10.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.196 
"is_configured": false, 00:13:10.196 "data_offset": 0, 00:13:10.196 "data_size": 0 00:13:10.196 } 00:13:10.196 ] 00:13:10.196 }' 00:13:10.196 19:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.196 19:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:10.763 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.022 [2024-07-24 19:49:02.375659] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:11.022 [2024-07-24 19:49:02.375694] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25922e0 name Existed_Raid, state configuring 00:13:11.022 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:11.281 [2024-07-24 19:49:02.628360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.281 [2024-07-24 19:49:02.629823] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.281 [2024-07-24 19:49:02.629859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.281 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.282 19:49:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.282 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.541 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.541 "name": "Existed_Raid", 00:13:11.541 "uuid": "28d5aee1-b1fd-4c06-a44e-8516da33d0cb", 00:13:11.541 "strip_size_kb": 0, 00:13:11.541 "state": "configuring", 00:13:11.541 "raid_level": "raid1", 00:13:11.541 "superblock": true, 00:13:11.541 "num_base_bdevs": 2, 00:13:11.541 "num_base_bdevs_discovered": 1, 00:13:11.541 "num_base_bdevs_operational": 2, 00:13:11.541 "base_bdevs_list": [ 00:13:11.541 { 00:13:11.541 "name": "BaseBdev1", 00:13:11.541 "uuid": "4de3317c-640a-4ca5-bf34-8dee8f6f29a3", 00:13:11.541 "is_configured": true, 00:13:11.541 "data_offset": 2048, 00:13:11.541 "data_size": 63488 00:13:11.541 }, 00:13:11.541 { 00:13:11.541 "name": 
"BaseBdev2", 00:13:11.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.541 "is_configured": false, 00:13:11.541 "data_offset": 0, 00:13:11.541 "data_size": 0 00:13:11.541 } 00:13:11.541 ] 00:13:11.541 }' 00:13:11.541 19:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.541 19:49:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:12.108 19:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:12.108 [2024-07-24 19:49:03.666373] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:12.109 [2024-07-24 19:49:03.666529] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25930d0 00:13:12.109 [2024-07-24 19:49:03.666542] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:12.109 [2024-07-24 19:49:03.666717] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2746bd0 00:13:12.109 [2024-07-24 19:49:03.666845] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25930d0 00:13:12.109 [2024-07-24 19:49:03.666856] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25930d0 00:13:12.109 [2024-07-24 19:49:03.666948] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:12.109 BaseBdev2 00:13:12.109 19:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:12.109 19:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:12.109 19:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:12.109 19:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- 
# local i 00:13:12.109 19:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:12.109 19:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:12.109 19:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:12.368 19:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:12.626 [ 00:13:12.626 { 00:13:12.626 "name": "BaseBdev2", 00:13:12.626 "aliases": [ 00:13:12.626 "98b34411-622e-433b-8e13-83e040b8dec8" 00:13:12.626 ], 00:13:12.626 "product_name": "Malloc disk", 00:13:12.626 "block_size": 512, 00:13:12.626 "num_blocks": 65536, 00:13:12.626 "uuid": "98b34411-622e-433b-8e13-83e040b8dec8", 00:13:12.626 "assigned_rate_limits": { 00:13:12.626 "rw_ios_per_sec": 0, 00:13:12.626 "rw_mbytes_per_sec": 0, 00:13:12.626 "r_mbytes_per_sec": 0, 00:13:12.626 "w_mbytes_per_sec": 0 00:13:12.626 }, 00:13:12.626 "claimed": true, 00:13:12.626 "claim_type": "exclusive_write", 00:13:12.626 "zoned": false, 00:13:12.626 "supported_io_types": { 00:13:12.626 "read": true, 00:13:12.626 "write": true, 00:13:12.626 "unmap": true, 00:13:12.626 "flush": true, 00:13:12.626 "reset": true, 00:13:12.626 "nvme_admin": false, 00:13:12.626 "nvme_io": false, 00:13:12.626 "nvme_io_md": false, 00:13:12.626 "write_zeroes": true, 00:13:12.626 "zcopy": true, 00:13:12.626 "get_zone_info": false, 00:13:12.626 "zone_management": false, 00:13:12.626 "zone_append": false, 00:13:12.626 "compare": false, 00:13:12.626 "compare_and_write": false, 00:13:12.626 "abort": true, 00:13:12.626 "seek_hole": false, 00:13:12.626 "seek_data": false, 00:13:12.626 "copy": true, 00:13:12.626 "nvme_iov_md": false 00:13:12.626 }, 
00:13:12.626 "memory_domains": [ 00:13:12.626 { 00:13:12.627 "dma_device_id": "system", 00:13:12.627 "dma_device_type": 1 00:13:12.627 }, 00:13:12.627 { 00:13:12.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.627 "dma_device_type": 2 00:13:12.627 } 00:13:12.627 ], 00:13:12.627 "driver_specific": {} 00:13:12.627 } 00:13:12.627 ] 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.627 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.886 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.886 "name": "Existed_Raid", 00:13:12.886 "uuid": "28d5aee1-b1fd-4c06-a44e-8516da33d0cb", 00:13:12.886 "strip_size_kb": 0, 00:13:12.886 "state": "online", 00:13:12.886 "raid_level": "raid1", 00:13:12.886 "superblock": true, 00:13:12.886 "num_base_bdevs": 2, 00:13:12.886 "num_base_bdevs_discovered": 2, 00:13:12.886 "num_base_bdevs_operational": 2, 00:13:12.886 "base_bdevs_list": [ 00:13:12.886 { 00:13:12.886 "name": "BaseBdev1", 00:13:12.886 "uuid": "4de3317c-640a-4ca5-bf34-8dee8f6f29a3", 00:13:12.886 "is_configured": true, 00:13:12.886 "data_offset": 2048, 00:13:12.886 "data_size": 63488 00:13:12.886 }, 00:13:12.886 { 00:13:12.886 "name": "BaseBdev2", 00:13:12.886 "uuid": "98b34411-622e-433b-8e13-83e040b8dec8", 00:13:12.886 "is_configured": true, 00:13:12.886 "data_offset": 2048, 00:13:12.886 "data_size": 63488 00:13:12.886 } 00:13:12.886 ] 00:13:12.886 }' 00:13:12.886 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.886 19:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:13.455 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:13.455 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:13.455 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:13.455 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:13.455 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:13.455 19:49:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:13.455 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:13.455 19:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:13.714 [2024-07-24 19:49:05.174640] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:13.714 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:13.714 "name": "Existed_Raid", 00:13:13.714 "aliases": [ 00:13:13.714 "28d5aee1-b1fd-4c06-a44e-8516da33d0cb" 00:13:13.714 ], 00:13:13.714 "product_name": "Raid Volume", 00:13:13.714 "block_size": 512, 00:13:13.714 "num_blocks": 63488, 00:13:13.714 "uuid": "28d5aee1-b1fd-4c06-a44e-8516da33d0cb", 00:13:13.714 "assigned_rate_limits": { 00:13:13.714 "rw_ios_per_sec": 0, 00:13:13.714 "rw_mbytes_per_sec": 0, 00:13:13.714 "r_mbytes_per_sec": 0, 00:13:13.714 "w_mbytes_per_sec": 0 00:13:13.714 }, 00:13:13.714 "claimed": false, 00:13:13.714 "zoned": false, 00:13:13.714 "supported_io_types": { 00:13:13.714 "read": true, 00:13:13.714 "write": true, 00:13:13.714 "unmap": false, 00:13:13.714 "flush": false, 00:13:13.714 "reset": true, 00:13:13.714 "nvme_admin": false, 00:13:13.714 "nvme_io": false, 00:13:13.714 "nvme_io_md": false, 00:13:13.714 "write_zeroes": true, 00:13:13.714 "zcopy": false, 00:13:13.714 "get_zone_info": false, 00:13:13.715 "zone_management": false, 00:13:13.715 "zone_append": false, 00:13:13.715 "compare": false, 00:13:13.715 "compare_and_write": false, 00:13:13.715 "abort": false, 00:13:13.715 "seek_hole": false, 00:13:13.715 "seek_data": false, 00:13:13.715 "copy": false, 00:13:13.715 "nvme_iov_md": false 00:13:13.715 }, 00:13:13.715 "memory_domains": [ 00:13:13.715 { 00:13:13.715 "dma_device_id": "system", 00:13:13.715 "dma_device_type": 1 
00:13:13.715 }, 00:13:13.715 { 00:13:13.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.715 "dma_device_type": 2 00:13:13.715 }, 00:13:13.715 { 00:13:13.715 "dma_device_id": "system", 00:13:13.715 "dma_device_type": 1 00:13:13.715 }, 00:13:13.715 { 00:13:13.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.715 "dma_device_type": 2 00:13:13.715 } 00:13:13.715 ], 00:13:13.715 "driver_specific": { 00:13:13.715 "raid": { 00:13:13.715 "uuid": "28d5aee1-b1fd-4c06-a44e-8516da33d0cb", 00:13:13.715 "strip_size_kb": 0, 00:13:13.715 "state": "online", 00:13:13.715 "raid_level": "raid1", 00:13:13.715 "superblock": true, 00:13:13.715 "num_base_bdevs": 2, 00:13:13.715 "num_base_bdevs_discovered": 2, 00:13:13.715 "num_base_bdevs_operational": 2, 00:13:13.715 "base_bdevs_list": [ 00:13:13.715 { 00:13:13.715 "name": "BaseBdev1", 00:13:13.715 "uuid": "4de3317c-640a-4ca5-bf34-8dee8f6f29a3", 00:13:13.715 "is_configured": true, 00:13:13.715 "data_offset": 2048, 00:13:13.715 "data_size": 63488 00:13:13.715 }, 00:13:13.715 { 00:13:13.715 "name": "BaseBdev2", 00:13:13.715 "uuid": "98b34411-622e-433b-8e13-83e040b8dec8", 00:13:13.715 "is_configured": true, 00:13:13.715 "data_offset": 2048, 00:13:13.715 "data_size": 63488 00:13:13.715 } 00:13:13.715 ] 00:13:13.715 } 00:13:13.715 } 00:13:13.715 }' 00:13:13.715 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:13.715 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:13.715 BaseBdev2' 00:13:13.715 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:13.715 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:13.715 19:49:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:13.974 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:13.974 "name": "BaseBdev1", 00:13:13.974 "aliases": [ 00:13:13.974 "4de3317c-640a-4ca5-bf34-8dee8f6f29a3" 00:13:13.974 ], 00:13:13.974 "product_name": "Malloc disk", 00:13:13.974 "block_size": 512, 00:13:13.974 "num_blocks": 65536, 00:13:13.974 "uuid": "4de3317c-640a-4ca5-bf34-8dee8f6f29a3", 00:13:13.974 "assigned_rate_limits": { 00:13:13.974 "rw_ios_per_sec": 0, 00:13:13.974 "rw_mbytes_per_sec": 0, 00:13:13.974 "r_mbytes_per_sec": 0, 00:13:13.975 "w_mbytes_per_sec": 0 00:13:13.975 }, 00:13:13.975 "claimed": true, 00:13:13.975 "claim_type": "exclusive_write", 00:13:13.975 "zoned": false, 00:13:13.975 "supported_io_types": { 00:13:13.975 "read": true, 00:13:13.975 "write": true, 00:13:13.975 "unmap": true, 00:13:13.975 "flush": true, 00:13:13.975 "reset": true, 00:13:13.975 "nvme_admin": false, 00:13:13.975 "nvme_io": false, 00:13:13.975 "nvme_io_md": false, 00:13:13.975 "write_zeroes": true, 00:13:13.975 "zcopy": true, 00:13:13.975 "get_zone_info": false, 00:13:13.975 "zone_management": false, 00:13:13.975 "zone_append": false, 00:13:13.975 "compare": false, 00:13:13.975 "compare_and_write": false, 00:13:13.975 "abort": true, 00:13:13.975 "seek_hole": false, 00:13:13.975 "seek_data": false, 00:13:13.975 "copy": true, 00:13:13.975 "nvme_iov_md": false 00:13:13.975 }, 00:13:13.975 "memory_domains": [ 00:13:13.975 { 00:13:13.975 "dma_device_id": "system", 00:13:13.975 "dma_device_type": 1 00:13:13.975 }, 00:13:13.975 { 00:13:13.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.975 "dma_device_type": 2 00:13:13.975 } 00:13:13.975 ], 00:13:13.975 "driver_specific": {} 00:13:13.975 }' 00:13:13.975 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.975 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:13:13.975 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:13.975 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.975 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:14.234 19:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.493 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.494 "name": "BaseBdev2", 00:13:14.494 "aliases": [ 00:13:14.494 "98b34411-622e-433b-8e13-83e040b8dec8" 00:13:14.494 ], 00:13:14.494 "product_name": "Malloc disk", 00:13:14.494 "block_size": 512, 00:13:14.494 "num_blocks": 65536, 00:13:14.494 "uuid": "98b34411-622e-433b-8e13-83e040b8dec8", 00:13:14.494 "assigned_rate_limits": { 00:13:14.494 "rw_ios_per_sec": 0, 00:13:14.494 
"rw_mbytes_per_sec": 0, 00:13:14.494 "r_mbytes_per_sec": 0, 00:13:14.494 "w_mbytes_per_sec": 0 00:13:14.494 }, 00:13:14.494 "claimed": true, 00:13:14.494 "claim_type": "exclusive_write", 00:13:14.494 "zoned": false, 00:13:14.494 "supported_io_types": { 00:13:14.494 "read": true, 00:13:14.494 "write": true, 00:13:14.494 "unmap": true, 00:13:14.494 "flush": true, 00:13:14.494 "reset": true, 00:13:14.494 "nvme_admin": false, 00:13:14.494 "nvme_io": false, 00:13:14.494 "nvme_io_md": false, 00:13:14.494 "write_zeroes": true, 00:13:14.494 "zcopy": true, 00:13:14.494 "get_zone_info": false, 00:13:14.494 "zone_management": false, 00:13:14.494 "zone_append": false, 00:13:14.494 "compare": false, 00:13:14.494 "compare_and_write": false, 00:13:14.494 "abort": true, 00:13:14.494 "seek_hole": false, 00:13:14.494 "seek_data": false, 00:13:14.494 "copy": true, 00:13:14.494 "nvme_iov_md": false 00:13:14.494 }, 00:13:14.494 "memory_domains": [ 00:13:14.494 { 00:13:14.494 "dma_device_id": "system", 00:13:14.494 "dma_device_type": 1 00:13:14.494 }, 00:13:14.494 { 00:13:14.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.494 "dma_device_type": 2 00:13:14.494 } 00:13:14.494 ], 00:13:14.494 "driver_specific": {} 00:13:14.494 }' 00:13:14.494 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.494 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.753 19:49:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.753 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.013 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.013 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:15.013 [2024-07-24 19:49:06.598191] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:15.332 19:49:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.332 19:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.592 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.592 "name": "Existed_Raid", 00:13:15.592 "uuid": "28d5aee1-b1fd-4c06-a44e-8516da33d0cb", 00:13:15.592 "strip_size_kb": 0, 00:13:15.592 "state": "online", 00:13:15.592 "raid_level": "raid1", 00:13:15.592 "superblock": true, 00:13:15.592 "num_base_bdevs": 2, 00:13:15.592 "num_base_bdevs_discovered": 1, 00:13:15.592 "num_base_bdevs_operational": 1, 00:13:15.592 "base_bdevs_list": [ 00:13:15.592 { 00:13:15.592 "name": null, 00:13:15.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.592 "is_configured": false, 00:13:15.592 "data_offset": 2048, 00:13:15.592 "data_size": 63488 00:13:15.592 }, 00:13:15.592 { 00:13:15.592 "name": "BaseBdev2", 00:13:15.592 "uuid": "98b34411-622e-433b-8e13-83e040b8dec8", 00:13:15.592 "is_configured": true, 00:13:15.592 "data_offset": 2048, 00:13:15.592 "data_size": 63488 00:13:15.592 } 00:13:15.592 ] 00:13:15.592 }' 00:13:15.592 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.592 19:49:07 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:16.159 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:16.159 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:16.159 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.159 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:16.418 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:16.418 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:16.418 19:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:16.677 [2024-07-24 19:49:08.224472] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:16.677 [2024-07-24 19:49:08.224556] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:16.677 [2024-07-24 19:49:08.235457] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:16.678 [2024-07-24 19:49:08.235493] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:16.678 [2024-07-24 19:49:08.235504] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25930d0 name Existed_Raid, state offline 00:13:16.678 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:16.678 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:16.678 19:49:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.678 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1388570 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1388570 ']' 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1388570 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:16.937 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1388570 00:13:17.197 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:17.197 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:17.197 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1388570' 00:13:17.197 killing process with pid 1388570 00:13:17.197 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1388570 00:13:17.197 [2024-07-24 19:49:08.554385] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:17.197 19:49:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1388570 00:13:17.197 [2024-07-24 19:49:08.555353] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:17.197 19:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:17.197 00:13:17.197 real 0m10.977s 00:13:17.197 user 0m19.519s 00:13:17.197 sys 0m2.033s 00:13:17.197 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:17.197 19:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.197 ************************************ 00:13:17.197 END TEST raid_state_function_test_sb 00:13:17.197 ************************************ 00:13:17.457 19:49:08 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:13:17.457 19:49:08 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:13:17.457 19:49:08 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:17.457 19:49:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:17.457 ************************************ 00:13:17.457 START TEST raid_superblock_test 00:13:17.457 ************************************ 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local 
base_bdevs_pt 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1390204 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1390204 /var/tmp/spdk-raid.sock 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1390204 ']' 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:13:17.457 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:17.457 19:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.457 [2024-07-24 19:49:08.972092] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:13:17.457 [2024-07-24 19:49:08.972227] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1390204 ] 00:13:17.717 [2024-07-24 19:49:09.166563] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:17.717 [2024-07-24 19:49:09.268310] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:17.976 [2024-07-24 19:49:09.343385] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:17.976 [2024-07-24 19:49:09.343442] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:18.545 19:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:18.545 malloc1 00:13:18.545 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:18.804 [2024-07-24 19:49:10.273206] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:18.804 [2024-07-24 19:49:10.273250] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:18.804 [2024-07-24 19:49:10.273270] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fd6590 00:13:18.804 [2024-07-24 19:49:10.273282] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:18.804 [2024-07-24 19:49:10.274847] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:18.804 [2024-07-24 19:49:10.274874] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:18.804 pt1 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 
-- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:18.804 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:19.064 malloc2 00:13:19.064 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:19.323 [2024-07-24 19:49:10.768512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:19.323 [2024-07-24 19:49:10.768559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:19.323 [2024-07-24 19:49:10.768577] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217c690 00:13:19.323 [2024-07-24 19:49:10.768590] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:19.323 [2024-07-24 19:49:10.770155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:19.323 [2024-07-24 19:49:10.770183] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:19.323 pt2 00:13:19.323 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:13:19.323 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:13:19.323 19:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:19.582 [2024-07-24 19:49:11.009160] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:19.582 [2024-07-24 19:49:11.010480] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:19.582 [2024-07-24 19:49:11.010627] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x217d980 00:13:19.583 [2024-07-24 19:49:11.010640] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:19.583 [2024-07-24 19:49:11.010835] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217e730 00:13:19.583 [2024-07-24 19:49:11.010987] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x217d980 00:13:19.583 [2024-07-24 19:49:11.010999] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x217d980 00:13:19.583 [2024-07-24 19:49:11.011098] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.583 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:19.842 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.842 "name": "raid_bdev1", 00:13:19.842 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:19.842 "strip_size_kb": 0, 00:13:19.842 "state": "online", 00:13:19.842 "raid_level": "raid1", 00:13:19.842 "superblock": true, 00:13:19.842 "num_base_bdevs": 2, 00:13:19.842 "num_base_bdevs_discovered": 2, 00:13:19.842 "num_base_bdevs_operational": 2, 00:13:19.842 "base_bdevs_list": [ 00:13:19.842 { 00:13:19.842 "name": "pt1", 00:13:19.842 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:19.842 "is_configured": true, 00:13:19.842 "data_offset": 2048, 00:13:19.842 "data_size": 63488 00:13:19.842 }, 00:13:19.842 { 00:13:19.842 "name": "pt2", 00:13:19.842 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:19.842 "is_configured": true, 00:13:19.842 "data_offset": 2048, 00:13:19.842 "data_size": 63488 00:13:19.842 } 00:13:19.842 ] 00:13:19.842 }' 00:13:19.842 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.842 19:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:20.411 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:13:20.411 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:20.411 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:20.411 19:49:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:20.411 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:20.411 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:20.411 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:20.411 19:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:20.670 [2024-07-24 19:49:12.088221] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:20.670 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:20.670 "name": "raid_bdev1", 00:13:20.670 "aliases": [ 00:13:20.670 "bae72963-6736-4f7f-8800-36e07c849f1f" 00:13:20.670 ], 00:13:20.670 "product_name": "Raid Volume", 00:13:20.670 "block_size": 512, 00:13:20.670 "num_blocks": 63488, 00:13:20.670 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:20.670 "assigned_rate_limits": { 00:13:20.670 "rw_ios_per_sec": 0, 00:13:20.670 "rw_mbytes_per_sec": 0, 00:13:20.670 "r_mbytes_per_sec": 0, 00:13:20.670 "w_mbytes_per_sec": 0 00:13:20.670 }, 00:13:20.670 "claimed": false, 00:13:20.670 "zoned": false, 00:13:20.670 "supported_io_types": { 00:13:20.670 "read": true, 00:13:20.670 "write": true, 00:13:20.670 "unmap": false, 00:13:20.670 "flush": false, 00:13:20.670 "reset": true, 00:13:20.670 "nvme_admin": false, 00:13:20.670 "nvme_io": false, 00:13:20.670 "nvme_io_md": false, 00:13:20.670 "write_zeroes": true, 00:13:20.670 "zcopy": false, 00:13:20.670 "get_zone_info": false, 00:13:20.670 "zone_management": false, 00:13:20.670 "zone_append": false, 00:13:20.671 "compare": false, 00:13:20.671 "compare_and_write": false, 00:13:20.671 "abort": false, 00:13:20.671 "seek_hole": false, 00:13:20.671 "seek_data": false, 00:13:20.671 "copy": 
false, 00:13:20.671 "nvme_iov_md": false 00:13:20.671 }, 00:13:20.671 "memory_domains": [ 00:13:20.671 { 00:13:20.671 "dma_device_id": "system", 00:13:20.671 "dma_device_type": 1 00:13:20.671 }, 00:13:20.671 { 00:13:20.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.671 "dma_device_type": 2 00:13:20.671 }, 00:13:20.671 { 00:13:20.671 "dma_device_id": "system", 00:13:20.671 "dma_device_type": 1 00:13:20.671 }, 00:13:20.671 { 00:13:20.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.671 "dma_device_type": 2 00:13:20.671 } 00:13:20.671 ], 00:13:20.671 "driver_specific": { 00:13:20.671 "raid": { 00:13:20.671 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:20.671 "strip_size_kb": 0, 00:13:20.671 "state": "online", 00:13:20.671 "raid_level": "raid1", 00:13:20.671 "superblock": true, 00:13:20.671 "num_base_bdevs": 2, 00:13:20.671 "num_base_bdevs_discovered": 2, 00:13:20.671 "num_base_bdevs_operational": 2, 00:13:20.671 "base_bdevs_list": [ 00:13:20.671 { 00:13:20.671 "name": "pt1", 00:13:20.671 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.671 "is_configured": true, 00:13:20.671 "data_offset": 2048, 00:13:20.671 "data_size": 63488 00:13:20.671 }, 00:13:20.671 { 00:13:20.671 "name": "pt2", 00:13:20.671 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:20.671 "is_configured": true, 00:13:20.671 "data_offset": 2048, 00:13:20.671 "data_size": 63488 00:13:20.671 } 00:13:20.671 ] 00:13:20.671 } 00:13:20.671 } 00:13:20.671 }' 00:13:20.671 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:20.671 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:20.671 pt2' 00:13:20.671 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:20.671 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:20.671 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:20.930 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:20.930 "name": "pt1", 00:13:20.930 "aliases": [ 00:13:20.930 "00000000-0000-0000-0000-000000000001" 00:13:20.930 ], 00:13:20.930 "product_name": "passthru", 00:13:20.930 "block_size": 512, 00:13:20.930 "num_blocks": 65536, 00:13:20.930 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.930 "assigned_rate_limits": { 00:13:20.930 "rw_ios_per_sec": 0, 00:13:20.930 "rw_mbytes_per_sec": 0, 00:13:20.930 "r_mbytes_per_sec": 0, 00:13:20.930 "w_mbytes_per_sec": 0 00:13:20.930 }, 00:13:20.930 "claimed": true, 00:13:20.930 "claim_type": "exclusive_write", 00:13:20.930 "zoned": false, 00:13:20.930 "supported_io_types": { 00:13:20.930 "read": true, 00:13:20.930 "write": true, 00:13:20.930 "unmap": true, 00:13:20.930 "flush": true, 00:13:20.930 "reset": true, 00:13:20.930 "nvme_admin": false, 00:13:20.930 "nvme_io": false, 00:13:20.930 "nvme_io_md": false, 00:13:20.930 "write_zeroes": true, 00:13:20.930 "zcopy": true, 00:13:20.930 "get_zone_info": false, 00:13:20.930 "zone_management": false, 00:13:20.930 "zone_append": false, 00:13:20.930 "compare": false, 00:13:20.930 "compare_and_write": false, 00:13:20.930 "abort": true, 00:13:20.930 "seek_hole": false, 00:13:20.930 "seek_data": false, 00:13:20.930 "copy": true, 00:13:20.930 "nvme_iov_md": false 00:13:20.930 }, 00:13:20.930 "memory_domains": [ 00:13:20.930 { 00:13:20.930 "dma_device_id": "system", 00:13:20.930 "dma_device_type": 1 00:13:20.930 }, 00:13:20.930 { 00:13:20.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.930 "dma_device_type": 2 00:13:20.930 } 00:13:20.930 ], 00:13:20.930 "driver_specific": { 00:13:20.930 "passthru": { 00:13:20.930 "name": "pt1", 00:13:20.930 "base_bdev_name": "malloc1" 
00:13:20.930 } 00:13:20.930 } 00:13:20.930 }' 00:13:20.930 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:20.930 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:20.930 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:20.930 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:21.190 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:21.449 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:21.449 "name": "pt2", 00:13:21.449 "aliases": [ 00:13:21.449 "00000000-0000-0000-0000-000000000002" 00:13:21.449 ], 00:13:21.449 "product_name": "passthru", 00:13:21.449 "block_size": 512, 00:13:21.449 "num_blocks": 65536, 00:13:21.449 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:13:21.449 "assigned_rate_limits": { 00:13:21.449 "rw_ios_per_sec": 0, 00:13:21.449 "rw_mbytes_per_sec": 0, 00:13:21.449 "r_mbytes_per_sec": 0, 00:13:21.449 "w_mbytes_per_sec": 0 00:13:21.449 }, 00:13:21.449 "claimed": true, 00:13:21.449 "claim_type": "exclusive_write", 00:13:21.449 "zoned": false, 00:13:21.449 "supported_io_types": { 00:13:21.449 "read": true, 00:13:21.449 "write": true, 00:13:21.449 "unmap": true, 00:13:21.449 "flush": true, 00:13:21.449 "reset": true, 00:13:21.449 "nvme_admin": false, 00:13:21.449 "nvme_io": false, 00:13:21.449 "nvme_io_md": false, 00:13:21.449 "write_zeroes": true, 00:13:21.449 "zcopy": true, 00:13:21.449 "get_zone_info": false, 00:13:21.449 "zone_management": false, 00:13:21.449 "zone_append": false, 00:13:21.449 "compare": false, 00:13:21.449 "compare_and_write": false, 00:13:21.449 "abort": true, 00:13:21.449 "seek_hole": false, 00:13:21.449 "seek_data": false, 00:13:21.449 "copy": true, 00:13:21.449 "nvme_iov_md": false 00:13:21.449 }, 00:13:21.449 "memory_domains": [ 00:13:21.449 { 00:13:21.449 "dma_device_id": "system", 00:13:21.449 "dma_device_type": 1 00:13:21.449 }, 00:13:21.449 { 00:13:21.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.449 "dma_device_type": 2 00:13:21.449 } 00:13:21.449 ], 00:13:21.449 "driver_specific": { 00:13:21.449 "passthru": { 00:13:21.449 "name": "pt2", 00:13:21.449 "base_bdev_name": "malloc2" 00:13:21.449 } 00:13:21.449 } 00:13:21.449 }' 00:13:21.449 19:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.449 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.709 19:49:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.709 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.968 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:21.968 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:21.968 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:13:21.968 [2024-07-24 19:49:13.556098] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:22.227 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=bae72963-6736-4f7f-8800-36e07c849f1f 00:13:22.227 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z bae72963-6736-4f7f-8800-36e07c849f1f ']' 00:13:22.227 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:22.227 [2024-07-24 19:49:13.804522] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:22.227 [2024-07-24 19:49:13.804542] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:22.227 [2024-07-24 19:49:13.804593] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:22.227 [2024-07-24 19:49:13.804646] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:22.227 [2024-07-24 19:49:13.804657] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217d980 name raid_bdev1, state offline 00:13:22.487 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.487 19:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:13:22.487 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:13:22.487 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:13:22.487 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:13:22.487 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:22.746 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:13:22.747 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:23.010 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:23.010 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'malloc1 malloc2' -n raid_bdev1 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:23.271 19:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:23.530 [2024-07-24 19:49:15.027706] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:23.530 [2024-07-24 19:49:15.029096] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:23.530 [2024-07-24 19:49:15.029151] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:23.530 [2024-07-24 19:49:15.029191] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:23.530 [2024-07-24 19:49:15.029210] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:23.530 [2024-07-24 19:49:15.029220] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217b760 name raid_bdev1, state configuring 00:13:23.530 request: 00:13:23.530 { 00:13:23.530 "name": "raid_bdev1", 00:13:23.530 "raid_level": "raid1", 00:13:23.530 "base_bdevs": [ 00:13:23.530 "malloc1", 00:13:23.530 "malloc2" 00:13:23.530 ], 00:13:23.530 "superblock": false, 00:13:23.530 "method": "bdev_raid_create", 00:13:23.530 "req_id": 1 00:13:23.530 } 00:13:23.530 Got JSON-RPC error response 00:13:23.530 response: 00:13:23.530 { 00:13:23.530 "code": -17, 00:13:23.530 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:23.530 } 00:13:23.530 19:49:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:13:23.530 19:49:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:13:23.530 19:49:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:13:23.530 19:49:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:13:23.530 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.530 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:13:23.789 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 
00:13:23.789 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:13:23.789 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:24.049 [2024-07-24 19:49:15.516948] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:24.049 [2024-07-24 19:49:15.516987] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:24.049 [2024-07-24 19:49:15.517004] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217c460 00:13:24.049 [2024-07-24 19:49:15.517017] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:24.049 [2024-07-24 19:49:15.518591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:24.049 [2024-07-24 19:49:15.518619] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:24.049 [2024-07-24 19:49:15.518683] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:24.049 [2024-07-24 19:49:15.518710] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:24.049 pt1 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.049 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:24.309 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.309 "name": "raid_bdev1", 00:13:24.309 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:24.309 "strip_size_kb": 0, 00:13:24.309 "state": "configuring", 00:13:24.309 "raid_level": "raid1", 00:13:24.309 "superblock": true, 00:13:24.309 "num_base_bdevs": 2, 00:13:24.309 "num_base_bdevs_discovered": 1, 00:13:24.309 "num_base_bdevs_operational": 2, 00:13:24.309 "base_bdevs_list": [ 00:13:24.309 { 00:13:24.309 "name": "pt1", 00:13:24.309 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:24.309 "is_configured": true, 00:13:24.309 "data_offset": 2048, 00:13:24.309 "data_size": 63488 00:13:24.309 }, 00:13:24.309 { 00:13:24.309 "name": null, 00:13:24.309 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:24.309 "is_configured": false, 00:13:24.309 "data_offset": 2048, 00:13:24.309 "data_size": 63488 00:13:24.309 } 00:13:24.309 ] 00:13:24.309 }' 00:13:24.309 19:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.309 19:49:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.878 19:49:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:13:24.878 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:13:24.878 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:13:24.878 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:25.137 [2024-07-24 19:49:16.607872] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:25.137 [2024-07-24 19:49:16.607918] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:25.137 [2024-07-24 19:49:16.607940] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217b230 00:13:25.137 [2024-07-24 19:49:16.607952] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:25.137 [2024-07-24 19:49:16.608287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:25.137 [2024-07-24 19:49:16.608305] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:25.137 [2024-07-24 19:49:16.608364] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:25.137 [2024-07-24 19:49:16.608382] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:25.137 [2024-07-24 19:49:16.608486] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fd57e0 00:13:25.137 [2024-07-24 19:49:16.608497] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:25.137 [2024-07-24 19:49:16.608663] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21814c0 00:13:25.137 [2024-07-24 19:49:16.608793] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fd57e0 00:13:25.137 [2024-07-24 19:49:16.608803] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fd57e0 00:13:25.137 [2024-07-24 19:49:16.608900] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:25.137 pt2 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.137 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:25.397 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.397 "name": 
"raid_bdev1", 00:13:25.397 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:25.397 "strip_size_kb": 0, 00:13:25.397 "state": "online", 00:13:25.397 "raid_level": "raid1", 00:13:25.397 "superblock": true, 00:13:25.397 "num_base_bdevs": 2, 00:13:25.397 "num_base_bdevs_discovered": 2, 00:13:25.397 "num_base_bdevs_operational": 2, 00:13:25.397 "base_bdevs_list": [ 00:13:25.398 { 00:13:25.398 "name": "pt1", 00:13:25.398 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:25.398 "is_configured": true, 00:13:25.398 "data_offset": 2048, 00:13:25.398 "data_size": 63488 00:13:25.398 }, 00:13:25.398 { 00:13:25.398 "name": "pt2", 00:13:25.398 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:25.398 "is_configured": true, 00:13:25.398 "data_offset": 2048, 00:13:25.398 "data_size": 63488 00:13:25.398 } 00:13:25.398 ] 00:13:25.398 }' 00:13:25.398 19:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.398 19:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:25.966 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:26.226 [2024-07-24 
19:49:17.690992] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:26.226 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:26.226 "name": "raid_bdev1", 00:13:26.226 "aliases": [ 00:13:26.226 "bae72963-6736-4f7f-8800-36e07c849f1f" 00:13:26.226 ], 00:13:26.226 "product_name": "Raid Volume", 00:13:26.226 "block_size": 512, 00:13:26.226 "num_blocks": 63488, 00:13:26.226 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:26.226 "assigned_rate_limits": { 00:13:26.226 "rw_ios_per_sec": 0, 00:13:26.226 "rw_mbytes_per_sec": 0, 00:13:26.226 "r_mbytes_per_sec": 0, 00:13:26.226 "w_mbytes_per_sec": 0 00:13:26.226 }, 00:13:26.226 "claimed": false, 00:13:26.226 "zoned": false, 00:13:26.226 "supported_io_types": { 00:13:26.226 "read": true, 00:13:26.226 "write": true, 00:13:26.226 "unmap": false, 00:13:26.226 "flush": false, 00:13:26.226 "reset": true, 00:13:26.226 "nvme_admin": false, 00:13:26.226 "nvme_io": false, 00:13:26.226 "nvme_io_md": false, 00:13:26.226 "write_zeroes": true, 00:13:26.226 "zcopy": false, 00:13:26.226 "get_zone_info": false, 00:13:26.226 "zone_management": false, 00:13:26.226 "zone_append": false, 00:13:26.226 "compare": false, 00:13:26.226 "compare_and_write": false, 00:13:26.226 "abort": false, 00:13:26.226 "seek_hole": false, 00:13:26.226 "seek_data": false, 00:13:26.226 "copy": false, 00:13:26.226 "nvme_iov_md": false 00:13:26.226 }, 00:13:26.226 "memory_domains": [ 00:13:26.226 { 00:13:26.226 "dma_device_id": "system", 00:13:26.226 "dma_device_type": 1 00:13:26.226 }, 00:13:26.226 { 00:13:26.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.226 "dma_device_type": 2 00:13:26.226 }, 00:13:26.226 { 00:13:26.226 "dma_device_id": "system", 00:13:26.226 "dma_device_type": 1 00:13:26.226 }, 00:13:26.226 { 00:13:26.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.226 "dma_device_type": 2 00:13:26.226 } 00:13:26.226 ], 00:13:26.226 "driver_specific": { 00:13:26.226 
"raid": { 00:13:26.226 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:26.226 "strip_size_kb": 0, 00:13:26.226 "state": "online", 00:13:26.226 "raid_level": "raid1", 00:13:26.226 "superblock": true, 00:13:26.226 "num_base_bdevs": 2, 00:13:26.226 "num_base_bdevs_discovered": 2, 00:13:26.226 "num_base_bdevs_operational": 2, 00:13:26.226 "base_bdevs_list": [ 00:13:26.226 { 00:13:26.226 "name": "pt1", 00:13:26.226 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:26.226 "is_configured": true, 00:13:26.226 "data_offset": 2048, 00:13:26.226 "data_size": 63488 00:13:26.226 }, 00:13:26.226 { 00:13:26.226 "name": "pt2", 00:13:26.226 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:26.226 "is_configured": true, 00:13:26.226 "data_offset": 2048, 00:13:26.226 "data_size": 63488 00:13:26.226 } 00:13:26.226 ] 00:13:26.226 } 00:13:26.226 } 00:13:26.226 }' 00:13:26.226 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:26.226 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:26.226 pt2' 00:13:26.226 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:26.226 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:26.226 19:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:26.485 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:26.486 "name": "pt1", 00:13:26.486 "aliases": [ 00:13:26.486 "00000000-0000-0000-0000-000000000001" 00:13:26.486 ], 00:13:26.486 "product_name": "passthru", 00:13:26.486 "block_size": 512, 00:13:26.486 "num_blocks": 65536, 00:13:26.486 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:26.486 "assigned_rate_limits": { 
00:13:26.486 "rw_ios_per_sec": 0, 00:13:26.486 "rw_mbytes_per_sec": 0, 00:13:26.486 "r_mbytes_per_sec": 0, 00:13:26.486 "w_mbytes_per_sec": 0 00:13:26.486 }, 00:13:26.486 "claimed": true, 00:13:26.486 "claim_type": "exclusive_write", 00:13:26.486 "zoned": false, 00:13:26.486 "supported_io_types": { 00:13:26.486 "read": true, 00:13:26.486 "write": true, 00:13:26.486 "unmap": true, 00:13:26.486 "flush": true, 00:13:26.486 "reset": true, 00:13:26.486 "nvme_admin": false, 00:13:26.486 "nvme_io": false, 00:13:26.486 "nvme_io_md": false, 00:13:26.486 "write_zeroes": true, 00:13:26.486 "zcopy": true, 00:13:26.486 "get_zone_info": false, 00:13:26.486 "zone_management": false, 00:13:26.486 "zone_append": false, 00:13:26.486 "compare": false, 00:13:26.486 "compare_and_write": false, 00:13:26.486 "abort": true, 00:13:26.486 "seek_hole": false, 00:13:26.486 "seek_data": false, 00:13:26.486 "copy": true, 00:13:26.486 "nvme_iov_md": false 00:13:26.486 }, 00:13:26.486 "memory_domains": [ 00:13:26.486 { 00:13:26.486 "dma_device_id": "system", 00:13:26.486 "dma_device_type": 1 00:13:26.486 }, 00:13:26.486 { 00:13:26.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.486 "dma_device_type": 2 00:13:26.486 } 00:13:26.486 ], 00:13:26.486 "driver_specific": { 00:13:26.486 "passthru": { 00:13:26.486 "name": "pt1", 00:13:26.486 "base_bdev_name": "malloc1" 00:13:26.486 } 00:13:26.486 } 00:13:26.486 }' 00:13:26.486 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.486 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:26.745 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.003 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.003 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:27.003 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:27.003 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:27.263 "name": "pt2", 00:13:27.263 "aliases": [ 00:13:27.263 "00000000-0000-0000-0000-000000000002" 00:13:27.263 ], 00:13:27.263 "product_name": "passthru", 00:13:27.263 "block_size": 512, 00:13:27.263 "num_blocks": 65536, 00:13:27.263 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:27.263 "assigned_rate_limits": { 00:13:27.263 "rw_ios_per_sec": 0, 00:13:27.263 "rw_mbytes_per_sec": 0, 00:13:27.263 "r_mbytes_per_sec": 0, 00:13:27.263 "w_mbytes_per_sec": 0 00:13:27.263 }, 00:13:27.263 "claimed": true, 00:13:27.263 "claim_type": "exclusive_write", 00:13:27.263 "zoned": false, 00:13:27.263 "supported_io_types": { 00:13:27.263 "read": true, 00:13:27.263 "write": true, 00:13:27.263 "unmap": true, 00:13:27.263 "flush": true, 00:13:27.263 "reset": true, 00:13:27.263 "nvme_admin": false, 00:13:27.263 "nvme_io": false, 00:13:27.263 "nvme_io_md": false, 00:13:27.263 "write_zeroes": true, 
00:13:27.263 "zcopy": true, 00:13:27.263 "get_zone_info": false, 00:13:27.263 "zone_management": false, 00:13:27.263 "zone_append": false, 00:13:27.263 "compare": false, 00:13:27.263 "compare_and_write": false, 00:13:27.263 "abort": true, 00:13:27.263 "seek_hole": false, 00:13:27.263 "seek_data": false, 00:13:27.263 "copy": true, 00:13:27.263 "nvme_iov_md": false 00:13:27.263 }, 00:13:27.263 "memory_domains": [ 00:13:27.263 { 00:13:27.263 "dma_device_id": "system", 00:13:27.263 "dma_device_type": 1 00:13:27.263 }, 00:13:27.263 { 00:13:27.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.263 "dma_device_type": 2 00:13:27.263 } 00:13:27.263 ], 00:13:27.263 "driver_specific": { 00:13:27.263 "passthru": { 00:13:27.263 "name": "pt2", 00:13:27.263 "base_bdev_name": "malloc2" 00:13:27.263 } 00:13:27.263 } 00:13:27.263 }' 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.263 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.522 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.522 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.522 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.522 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:13:27.522 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:27.522 19:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:13:27.781 [2024-07-24 19:49:19.199001] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:27.781 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' bae72963-6736-4f7f-8800-36e07c849f1f '!=' bae72963-6736-4f7f-8800-36e07c849f1f ']' 00:13:27.781 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:13:27.781 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:27.781 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:27.781 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:28.041 [2024-07-24 19:49:19.435412] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.041 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:28.301 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.301 "name": "raid_bdev1", 00:13:28.301 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:28.301 "strip_size_kb": 0, 00:13:28.301 "state": "online", 00:13:28.301 "raid_level": "raid1", 00:13:28.301 "superblock": true, 00:13:28.301 "num_base_bdevs": 2, 00:13:28.301 "num_base_bdevs_discovered": 1, 00:13:28.301 "num_base_bdevs_operational": 1, 00:13:28.301 "base_bdevs_list": [ 00:13:28.301 { 00:13:28.301 "name": null, 00:13:28.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.301 "is_configured": false, 00:13:28.301 "data_offset": 2048, 00:13:28.301 "data_size": 63488 00:13:28.301 }, 00:13:28.301 { 00:13:28.301 "name": "pt2", 00:13:28.301 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:28.301 "is_configured": true, 00:13:28.301 "data_offset": 2048, 00:13:28.301 "data_size": 63488 00:13:28.301 } 00:13:28.301 ] 00:13:28.301 }' 00:13:28.301 19:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.301 19:49:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.869 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:13:28.869 [2024-07-24 19:49:20.458107] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:28.869 [2024-07-24 19:49:20.458136] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:28.869 [2024-07-24 19:49:20.458188] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.869 [2024-07-24 19:49:20.458233] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:28.869 [2024-07-24 19:49:20.458245] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fd57e0 name raid_bdev1, state offline 00:13:29.127 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.127 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:13:29.386 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:13:29.386 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:13:29.386 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:13:29.386 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:13:29.386 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:29.645 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:13:29.645 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:13:29.645 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:13:29.645 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:13:29.645 19:49:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=1 00:13:29.645 19:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:29.645 [2024-07-24 19:49:21.208044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:29.645 [2024-07-24 19:49:21.208096] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:29.645 [2024-07-24 19:49:21.208114] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217ae50 00:13:29.645 [2024-07-24 19:49:21.208127] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:29.645 [2024-07-24 19:49:21.209713] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:29.645 [2024-07-24 19:49:21.209741] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:29.645 [2024-07-24 19:49:21.209809] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:29.645 [2024-07-24 19:49:21.209836] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:29.645 [2024-07-24 19:49:21.209927] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2180e60 00:13:29.645 [2024-07-24 19:49:21.209938] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:29.645 [2024-07-24 19:49:21.210106] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217ced0 00:13:29.645 [2024-07-24 19:49:21.210223] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2180e60 00:13:29.645 [2024-07-24 19:49:21.210232] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2180e60 00:13:29.645 [2024-07-24 19:49:21.210326] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:29.645 pt2 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.645 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.904 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.904 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:29.904 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.904 "name": "raid_bdev1", 00:13:29.904 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:29.904 "strip_size_kb": 0, 00:13:29.904 "state": "online", 00:13:29.904 "raid_level": "raid1", 00:13:29.904 "superblock": true, 00:13:29.904 "num_base_bdevs": 2, 00:13:29.904 "num_base_bdevs_discovered": 1, 00:13:29.904 "num_base_bdevs_operational": 1, 00:13:29.904 "base_bdevs_list": [ 
00:13:29.904 { 00:13:29.904 "name": null, 00:13:29.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.904 "is_configured": false, 00:13:29.904 "data_offset": 2048, 00:13:29.904 "data_size": 63488 00:13:29.904 }, 00:13:29.904 { 00:13:29.904 "name": "pt2", 00:13:29.904 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:29.904 "is_configured": true, 00:13:29.904 "data_offset": 2048, 00:13:29.904 "data_size": 63488 00:13:29.904 } 00:13:29.904 ] 00:13:29.904 }' 00:13:29.904 19:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.904 19:49:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.472 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:30.731 [2024-07-24 19:49:22.286878] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:30.731 [2024-07-24 19:49:22.286904] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:30.731 [2024-07-24 19:49:22.286952] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:30.731 [2024-07-24 19:49:22.286994] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:30.732 [2024-07-24 19:49:22.287006] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2180e60 name raid_bdev1, state offline 00:13:30.732 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.732 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:13:30.991 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:13:30.991 19:49:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:13:30.991 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:13:30.991 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:31.251 [2024-07-24 19:49:22.788178] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:31.251 [2024-07-24 19:49:22.788221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:31.251 [2024-07-24 19:49:22.788237] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217dd20 00:13:31.251 [2024-07-24 19:49:22.788250] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:31.251 [2024-07-24 19:49:22.789836] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:31.251 [2024-07-24 19:49:22.789865] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:31.251 [2024-07-24 19:49:22.789927] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:31.251 [2024-07-24 19:49:22.789952] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:31.251 [2024-07-24 19:49:22.790050] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:31.251 [2024-07-24 19:49:22.790063] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:31.251 [2024-07-24 19:49:22.790076] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217ef40 name raid_bdev1, state configuring 00:13:31.251 [2024-07-24 19:49:22.790099] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:31.251 [2024-07-24 19:49:22.790153] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x217c8c0 00:13:31.251 [2024-07-24 19:49:22.790163] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:31.251 [2024-07-24 19:49:22.790327] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217aa60 00:13:31.251 [2024-07-24 19:49:22.790458] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x217c8c0 00:13:31.251 [2024-07-24 19:49:22.790469] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x217c8c0 00:13:31.251 [2024-07-24 19:49:22.790566] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:31.251 pt1 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.251 19:49:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:31.510 19:49:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.510 "name": "raid_bdev1", 00:13:31.510 "uuid": "bae72963-6736-4f7f-8800-36e07c849f1f", 00:13:31.510 "strip_size_kb": 0, 00:13:31.510 "state": "online", 00:13:31.510 "raid_level": "raid1", 00:13:31.510 "superblock": true, 00:13:31.510 "num_base_bdevs": 2, 00:13:31.510 "num_base_bdevs_discovered": 1, 00:13:31.510 "num_base_bdevs_operational": 1, 00:13:31.510 "base_bdevs_list": [ 00:13:31.510 { 00:13:31.510 "name": null, 00:13:31.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.510 "is_configured": false, 00:13:31.510 "data_offset": 2048, 00:13:31.510 "data_size": 63488 00:13:31.510 }, 00:13:31.510 { 00:13:31.510 "name": "pt2", 00:13:31.510 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:31.510 "is_configured": true, 00:13:31.510 "data_offset": 2048, 00:13:31.510 "data_size": 63488 00:13:31.510 } 00:13:31.510 ] 00:13:31.510 }' 00:13:31.510 19:49:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.510 19:49:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.077 19:49:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:32.077 19:49:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:32.336 19:49:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:13:32.336 19:49:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:32.336 19:49:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:13:32.596 [2024-07-24 19:49:23.995603] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' bae72963-6736-4f7f-8800-36e07c849f1f '!=' bae72963-6736-4f7f-8800-36e07c849f1f ']' 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1390204 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1390204 ']' 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1390204 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1390204 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1390204' 00:13:32.596 killing process with pid 1390204 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1390204 00:13:32.596 [2024-07-24 19:49:24.069980] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:32.596 [2024-07-24 19:49:24.070030] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:32.596 [2024-07-24 19:49:24.070072] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:13:32.596 [2024-07-24 19:49:24.070083] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217c8c0 name raid_bdev1, state offline 00:13:32.596 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1390204 00:13:32.596 [2024-07-24 19:49:24.086804] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:32.855 19:49:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:13:32.855 00:13:32.855 real 0m15.443s 00:13:32.855 user 0m27.935s 00:13:32.855 sys 0m2.947s 00:13:32.855 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:32.855 19:49:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.855 ************************************ 00:13:32.855 END TEST raid_superblock_test 00:13:32.855 ************************************ 00:13:32.855 19:49:24 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:32.855 19:49:24 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:32.855 19:49:24 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:32.855 19:49:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:32.855 ************************************ 00:13:32.855 START TEST raid_read_error_test 00:13:32.855 ************************************ 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 read 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 
00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.nkM7pO5VqP 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1392637 00:13:32.855 19:49:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1392637 /var/tmp/spdk-raid.sock 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1392637 ']' 00:13:32.855 19:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:32.856 19:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:32.856 19:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:32.856 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:32.856 19:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:32.856 19:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.115 [2024-07-24 19:49:24.467216] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:13:33.115 [2024-07-24 19:49:24.467274] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1392637 ] 00:13:33.115 [2024-07-24 19:49:24.579411] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:33.115 [2024-07-24 19:49:24.682883] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:33.374 [2024-07-24 19:49:24.750203] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:33.374 [2024-07-24 19:49:24.750251] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:33.944 19:49:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:33.944 19:49:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:13:33.944 19:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:33.944 19:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:34.282 BaseBdev1_malloc 00:13:34.282 19:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:34.282 true 00:13:34.550 19:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:34.550 [2024-07-24 19:49:26.061108] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:34.550 [2024-07-24 19:49:26.061155] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:34.550 [2024-07-24 19:49:26.061175] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110c3a0 00:13:34.550 [2024-07-24 19:49:26.061188] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.550 [2024-07-24 19:49:26.062763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:34.550 [2024-07-24 19:49:26.062791] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:34.550 BaseBdev1 00:13:34.550 19:49:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:34.550 19:49:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:34.810 BaseBdev2_malloc 00:13:34.810 19:49:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:35.070 true 00:13:35.070 19:49:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:35.329 [2024-07-24 19:49:26.811606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:35.329 [2024-07-24 19:49:26.811647] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:35.329 [2024-07-24 19:49:26.811670] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11cb370 00:13:35.329 [2024-07-24 19:49:26.811682] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:35.329 [2024-07-24 19:49:26.813154] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:35.329 [2024-07-24 19:49:26.813182] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:35.329 BaseBdev2 00:13:35.329 19:49:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:35.588 [2024-07-24 19:49:27.112420] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:35.588 [2024-07-24 19:49:27.113779] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:35.588 [2024-07-24 19:49:27.113975] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1102340 00:13:35.588 [2024-07-24 19:49:27.113988] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:35.588 [2024-07-24 19:49:27.114185] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1103050 00:13:35.588 [2024-07-24 19:49:27.114338] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1102340 00:13:35.588 [2024-07-24 19:49:27.114349] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1102340 00:13:35.588 [2024-07-24 19:49:27.114463] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.588 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.589 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.589 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:35.847 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.847 "name": "raid_bdev1", 00:13:35.847 "uuid": "bacae6c7-ad18-4105-8eca-d867bd78d750", 00:13:35.847 "strip_size_kb": 0, 00:13:35.847 "state": "online", 00:13:35.847 "raid_level": "raid1", 00:13:35.847 "superblock": true, 00:13:35.847 "num_base_bdevs": 2, 00:13:35.847 "num_base_bdevs_discovered": 2, 00:13:35.847 "num_base_bdevs_operational": 2, 00:13:35.847 "base_bdevs_list": [ 00:13:35.847 { 00:13:35.847 "name": "BaseBdev1", 00:13:35.847 "uuid": "0a0edb65-afa9-50fe-b66d-4ed55f88be5e", 00:13:35.847 "is_configured": true, 00:13:35.847 "data_offset": 2048, 00:13:35.847 "data_size": 63488 00:13:35.847 }, 00:13:35.847 { 00:13:35.847 "name": "BaseBdev2", 00:13:35.847 "uuid": "abc4e675-24c4-505c-985e-e620f392ad94", 00:13:35.847 "is_configured": true, 00:13:35.847 "data_offset": 2048, 00:13:35.847 "data_size": 63488 00:13:35.847 } 00:13:35.847 ] 00:13:35.847 }' 00:13:35.847 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.847 19:49:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.415 19:49:27 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:13:36.415 19:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:36.674 [2024-07-24 19:49:28.067277] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ccac0 00:13:37.611 19:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.871 19:49:29 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.871 "name": "raid_bdev1", 00:13:37.871 "uuid": "bacae6c7-ad18-4105-8eca-d867bd78d750", 00:13:37.871 "strip_size_kb": 0, 00:13:37.871 "state": "online", 00:13:37.871 "raid_level": "raid1", 00:13:37.871 "superblock": true, 00:13:37.871 "num_base_bdevs": 2, 00:13:37.871 "num_base_bdevs_discovered": 2, 00:13:37.871 "num_base_bdevs_operational": 2, 00:13:37.871 "base_bdevs_list": [ 00:13:37.871 { 00:13:37.871 "name": "BaseBdev1", 00:13:37.871 "uuid": "0a0edb65-afa9-50fe-b66d-4ed55f88be5e", 00:13:37.871 "is_configured": true, 00:13:37.871 "data_offset": 2048, 00:13:37.871 "data_size": 63488 00:13:37.871 }, 00:13:37.871 { 00:13:37.871 "name": "BaseBdev2", 00:13:37.871 "uuid": "abc4e675-24c4-505c-985e-e620f392ad94", 00:13:37.871 "is_configured": true, 00:13:37.871 "data_offset": 2048, 00:13:37.871 "data_size": 63488 00:13:37.871 } 00:13:37.871 ] 00:13:37.871 }' 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.871 19:49:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:38.809 [2024-07-24 19:49:30.285979] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:13:38.809 [2024-07-24 19:49:30.286024] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:38.809 [2024-07-24 19:49:30.289185] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.809 [2024-07-24 19:49:30.289219] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:38.809 [2024-07-24 19:49:30.289293] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.809 [2024-07-24 19:49:30.289305] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1102340 name raid_bdev1, state offline 00:13:38.809 0 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1392637 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1392637 ']' 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1392637 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1392637 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1392637' 00:13:38.809 killing process with pid 1392637 00:13:38.809 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1392637 00:13:38.809 [2024-07-24 19:49:30.369693] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:38.809 19:49:30 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1392637 00:13:38.809 [2024-07-24 19:49:30.380385] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.nkM7pO5VqP 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:39.068 00:13:39.068 real 0m6.226s 00:13:39.068 user 0m9.726s 00:13:39.068 sys 0m1.081s 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:39.068 19:49:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.068 ************************************ 00:13:39.068 END TEST raid_read_error_test 00:13:39.068 ************************************ 00:13:39.068 19:49:30 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:39.068 19:49:30 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:39.328 19:49:30 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:39.328 19:49:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:39.328 ************************************ 00:13:39.328 START TEST raid_write_error_test 00:13:39.328 ************************************ 00:13:39.328 19:49:30 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 write 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:13:39.328 
19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.Zd51uCAWU8 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1393453 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1393453 /var/tmp/spdk-raid.sock 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1393453 ']' 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:39.328 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:39.328 19:49:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.328 [2024-07-24 19:49:30.778948] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:13:39.328 [2024-07-24 19:49:30.779020] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1393453 ] 00:13:39.328 [2024-07-24 19:49:30.911857] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:39.587 [2024-07-24 19:49:31.020686] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:39.587 [2024-07-24 19:49:31.081633] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:39.587 [2024-07-24 19:49:31.081667] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:39.846 19:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:39.846 19:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:13:39.846 19:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:39.846 19:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:40.105 BaseBdev1_malloc 00:13:40.105 19:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:40.363 true 00:13:40.363 19:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:40.622 [2024-07-24 19:49:31.970290] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:40.622 [2024-07-24 19:49:31.970335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:13:40.622 [2024-07-24 19:49:31.970356] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d233a0 00:13:40.622 [2024-07-24 19:49:31.970369] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:40.622 [2024-07-24 19:49:31.972222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:40.622 [2024-07-24 19:49:31.972252] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:40.622 BaseBdev1 00:13:40.622 19:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:40.622 19:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:40.881 BaseBdev2_malloc 00:13:40.881 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:40.881 true 00:13:41.139 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:41.139 [2024-07-24 19:49:32.714100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:41.140 [2024-07-24 19:49:32.714145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:41.140 [2024-07-24 19:49:32.714169] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de2370 00:13:41.140 [2024-07-24 19:49:32.714182] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:41.140 [2024-07-24 19:49:32.715749] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:41.140 [2024-07-24 19:49:32.715780] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:41.140 BaseBdev2 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:41.399 [2024-07-24 19:49:32.958772] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:41.399 [2024-07-24 19:49:32.960102] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:41.399 [2024-07-24 19:49:32.960299] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d19340 00:13:41.399 [2024-07-24 19:49:32.960313] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:41.399 [2024-07-24 19:49:32.960517] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d1a050 00:13:41.399 [2024-07-24 19:49:32.960674] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d19340 00:13:41.399 [2024-07-24 19:49:32.960684] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d19340 00:13:41.399 [2024-07-24 19:49:32.960794] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.399 19:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:41.658 19:49:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.658 "name": "raid_bdev1", 00:13:41.658 "uuid": "2fefbed7-947f-4438-8053-a2468728a7ac", 00:13:41.658 "strip_size_kb": 0, 00:13:41.658 "state": "online", 00:13:41.658 "raid_level": "raid1", 00:13:41.658 "superblock": true, 00:13:41.658 "num_base_bdevs": 2, 00:13:41.658 "num_base_bdevs_discovered": 2, 00:13:41.658 "num_base_bdevs_operational": 2, 00:13:41.658 "base_bdevs_list": [ 00:13:41.658 { 00:13:41.658 "name": "BaseBdev1", 00:13:41.658 "uuid": "8b4d54a0-00c1-58ec-8d14-b17f8e3a436f", 00:13:41.658 "is_configured": true, 00:13:41.658 "data_offset": 2048, 00:13:41.658 "data_size": 63488 00:13:41.658 }, 00:13:41.658 { 00:13:41.658 "name": "BaseBdev2", 00:13:41.658 "uuid": "3fa69c93-24af-5638-a923-03219b8179b6", 00:13:41.658 "is_configured": true, 00:13:41.658 "data_offset": 2048, 00:13:41.658 "data_size": 63488 00:13:41.658 } 00:13:41.658 ] 00:13:41.658 }' 00:13:41.658 19:49:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.658 19:49:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.594 
19:49:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:13:42.594 19:49:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:42.594 [2024-07-24 19:49:33.945769] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1de3ac0 00:13:43.531 19:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:43.531 [2024-07-24 19:49:35.070119] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:43.531 [2024-07-24 19:49:35.070165] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:43.531 [2024-07-24 19:49:35.070340] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1de3ac0 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=1 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:43.531 19:49:35 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.531 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:43.791 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.791 "name": "raid_bdev1", 00:13:43.791 "uuid": "2fefbed7-947f-4438-8053-a2468728a7ac", 00:13:43.791 "strip_size_kb": 0, 00:13:43.791 "state": "online", 00:13:43.791 "raid_level": "raid1", 00:13:43.791 "superblock": true, 00:13:43.791 "num_base_bdevs": 2, 00:13:43.791 "num_base_bdevs_discovered": 1, 00:13:43.791 "num_base_bdevs_operational": 1, 00:13:43.791 "base_bdevs_list": [ 00:13:43.791 { 00:13:43.791 "name": null, 00:13:43.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.791 "is_configured": false, 00:13:43.791 "data_offset": 2048, 00:13:43.791 "data_size": 63488 00:13:43.791 }, 00:13:43.791 { 00:13:43.791 "name": "BaseBdev2", 00:13:43.791 "uuid": "3fa69c93-24af-5638-a923-03219b8179b6", 00:13:43.791 "is_configured": true, 00:13:43.791 "data_offset": 2048, 00:13:43.791 "data_size": 63488 00:13:43.791 } 00:13:43.791 ] 00:13:43.791 }' 00:13:43.791 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:13:43.791 19:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.362 19:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:44.620 [2024-07-24 19:49:36.129850] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:44.620 [2024-07-24 19:49:36.129888] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:44.620 [2024-07-24 19:49:36.133021] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:44.620 [2024-07-24 19:49:36.133049] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:44.620 [2024-07-24 19:49:36.133103] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:44.620 [2024-07-24 19:49:36.133114] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d19340 name raid_bdev1, state offline 00:13:44.620 0 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1393453 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1393453 ']' 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1393453 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1393453 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo 
']' 00:13:44.620 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1393453' 00:13:44.879 killing process with pid 1393453 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1393453 00:13:44.879 [2024-07-24 19:49:36.212886] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1393453 00:13:44.879 [2024-07-24 19:49:36.223470] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.Zd51uCAWU8 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:44.879 00:13:44.879 real 0m5.761s 00:13:44.879 user 0m9.345s 00:13:44.879 sys 0m1.082s 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:44.879 19:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.879 ************************************ 00:13:44.879 END TEST raid_write_error_test 00:13:44.879 ************************************ 00:13:45.139 19:49:36 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:13:45.139 19:49:36 bdev_raid -- 
bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:13:45.139 19:49:36 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:45.139 19:49:36 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:45.139 19:49:36 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:45.139 19:49:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:45.139 ************************************ 00:13:45.139 START TEST raid_state_function_test 00:13:45.139 ************************************ 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 false 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1394405 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1394405' 00:13:45.139 Process raid pid: 1394405 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1394405 /var/tmp/spdk-raid.sock 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1394405 ']' 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:45.139 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:45.139 19:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.139 [2024-07-24 19:49:36.612274] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:13:45.139 [2024-07-24 19:49:36.612343] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:45.399 [2024-07-24 19:49:36.733134] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:45.399 [2024-07-24 19:49:36.839536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:45.399 [2024-07-24 19:49:36.906232] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:45.399 [2024-07-24 19:49:36.906267] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:46.337 19:49:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:46.337 19:49:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:13:46.337 19:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:46.906 [2024-07-24 19:49:38.301618] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:46.906 [2024-07-24 19:49:38.301658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:46.907 [2024-07-24 19:49:38.301669] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:46.907 [2024-07-24 19:49:38.301681] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:46.907 [2024-07-24 19:49:38.301690] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:46.907 [2024-07-24 19:49:38.301705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:46.907 19:49:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.907 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.475 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.475 "name": "Existed_Raid", 00:13:47.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.476 "strip_size_kb": 64, 00:13:47.476 "state": "configuring", 00:13:47.476 "raid_level": "raid0", 00:13:47.476 "superblock": false, 00:13:47.476 "num_base_bdevs": 3, 00:13:47.476 "num_base_bdevs_discovered": 0, 00:13:47.476 "num_base_bdevs_operational": 3, 00:13:47.476 "base_bdevs_list": [ 00:13:47.476 { 
00:13:47.476 "name": "BaseBdev1", 00:13:47.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.476 "is_configured": false, 00:13:47.476 "data_offset": 0, 00:13:47.476 "data_size": 0 00:13:47.476 }, 00:13:47.476 { 00:13:47.476 "name": "BaseBdev2", 00:13:47.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.476 "is_configured": false, 00:13:47.476 "data_offset": 0, 00:13:47.476 "data_size": 0 00:13:47.476 }, 00:13:47.476 { 00:13:47.476 "name": "BaseBdev3", 00:13:47.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.476 "is_configured": false, 00:13:47.476 "data_offset": 0, 00:13:47.476 "data_size": 0 00:13:47.476 } 00:13:47.476 ] 00:13:47.476 }' 00:13:47.476 19:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.476 19:49:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.044 19:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:48.303 [2024-07-24 19:49:39.689130] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:48.303 [2024-07-24 19:49:39.689158] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f47a10 name Existed_Raid, state configuring 00:13:48.303 19:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:48.562 [2024-07-24 19:49:39.933807] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:48.562 [2024-07-24 19:49:39.933835] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:48.562 [2024-07-24 19:49:39.933845] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:13:48.562 [2024-07-24 19:49:39.933857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:48.562 [2024-07-24 19:49:39.933865] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:48.562 [2024-07-24 19:49:39.933876] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:48.562 19:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:48.820 [2024-07-24 19:49:40.192501] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:48.820 BaseBdev1 00:13:48.820 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:48.820 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:48.821 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:48.821 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:48.821 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:48.821 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:48.821 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.079 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:49.339 [ 00:13:49.339 { 00:13:49.339 "name": "BaseBdev1", 00:13:49.339 "aliases": [ 00:13:49.339 
"f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d" 00:13:49.339 ], 00:13:49.339 "product_name": "Malloc disk", 00:13:49.339 "block_size": 512, 00:13:49.339 "num_blocks": 65536, 00:13:49.339 "uuid": "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d", 00:13:49.339 "assigned_rate_limits": { 00:13:49.339 "rw_ios_per_sec": 0, 00:13:49.339 "rw_mbytes_per_sec": 0, 00:13:49.339 "r_mbytes_per_sec": 0, 00:13:49.339 "w_mbytes_per_sec": 0 00:13:49.339 }, 00:13:49.339 "claimed": true, 00:13:49.339 "claim_type": "exclusive_write", 00:13:49.339 "zoned": false, 00:13:49.339 "supported_io_types": { 00:13:49.339 "read": true, 00:13:49.339 "write": true, 00:13:49.339 "unmap": true, 00:13:49.339 "flush": true, 00:13:49.339 "reset": true, 00:13:49.339 "nvme_admin": false, 00:13:49.339 "nvme_io": false, 00:13:49.339 "nvme_io_md": false, 00:13:49.339 "write_zeroes": true, 00:13:49.339 "zcopy": true, 00:13:49.339 "get_zone_info": false, 00:13:49.339 "zone_management": false, 00:13:49.339 "zone_append": false, 00:13:49.339 "compare": false, 00:13:49.339 "compare_and_write": false, 00:13:49.339 "abort": true, 00:13:49.339 "seek_hole": false, 00:13:49.339 "seek_data": false, 00:13:49.339 "copy": true, 00:13:49.339 "nvme_iov_md": false 00:13:49.339 }, 00:13:49.339 "memory_domains": [ 00:13:49.339 { 00:13:49.339 "dma_device_id": "system", 00:13:49.339 "dma_device_type": 1 00:13:49.339 }, 00:13:49.339 { 00:13:49.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.339 "dma_device_type": 2 00:13:49.339 } 00:13:49.339 ], 00:13:49.339 "driver_specific": {} 00:13:49.339 } 00:13:49.339 ] 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.339 "name": "Existed_Raid", 00:13:49.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.339 "strip_size_kb": 64, 00:13:49.339 "state": "configuring", 00:13:49.339 "raid_level": "raid0", 00:13:49.339 "superblock": false, 00:13:49.339 "num_base_bdevs": 3, 00:13:49.339 "num_base_bdevs_discovered": 1, 00:13:49.339 "num_base_bdevs_operational": 3, 00:13:49.339 "base_bdevs_list": [ 00:13:49.339 { 00:13:49.339 "name": "BaseBdev1", 00:13:49.339 "uuid": "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d", 00:13:49.339 "is_configured": true, 00:13:49.339 "data_offset": 0, 00:13:49.339 "data_size": 65536 00:13:49.339 }, 00:13:49.339 { 00:13:49.339 "name": "BaseBdev2", 00:13:49.339 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:49.339 "is_configured": false, 00:13:49.339 "data_offset": 0, 00:13:49.339 "data_size": 0 00:13:49.339 }, 00:13:49.339 { 00:13:49.339 "name": "BaseBdev3", 00:13:49.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.339 "is_configured": false, 00:13:49.339 "data_offset": 0, 00:13:49.339 "data_size": 0 00:13:49.339 } 00:13:49.339 ] 00:13:49.339 }' 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.339 19:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.909 19:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:50.168 [2024-07-24 19:49:41.704508] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:50.168 [2024-07-24 19:49:41.704542] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f472e0 name Existed_Raid, state configuring 00:13:50.168 19:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:50.429 [2024-07-24 19:49:41.993298] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:50.429 [2024-07-24 19:49:41.994817] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:50.429 [2024-07-24 19:49:41.994851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:50.429 [2024-07-24 19:49:41.994861] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:50.429 [2024-07-24 19:49:41.994873] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.429 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.688 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.688 "name": "Existed_Raid", 00:13:50.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.688 "strip_size_kb": 64, 00:13:50.688 "state": "configuring", 00:13:50.688 
"raid_level": "raid0", 00:13:50.688 "superblock": false, 00:13:50.688 "num_base_bdevs": 3, 00:13:50.688 "num_base_bdevs_discovered": 1, 00:13:50.688 "num_base_bdevs_operational": 3, 00:13:50.688 "base_bdevs_list": [ 00:13:50.688 { 00:13:50.688 "name": "BaseBdev1", 00:13:50.688 "uuid": "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d", 00:13:50.688 "is_configured": true, 00:13:50.688 "data_offset": 0, 00:13:50.688 "data_size": 65536 00:13:50.688 }, 00:13:50.688 { 00:13:50.688 "name": "BaseBdev2", 00:13:50.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.688 "is_configured": false, 00:13:50.688 "data_offset": 0, 00:13:50.688 "data_size": 0 00:13:50.688 }, 00:13:50.688 { 00:13:50.688 "name": "BaseBdev3", 00:13:50.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.688 "is_configured": false, 00:13:50.688 "data_offset": 0, 00:13:50.688 "data_size": 0 00:13:50.688 } 00:13:50.688 ] 00:13:50.688 }' 00:13:50.688 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.688 19:49:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.626 19:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:51.626 [2024-07-24 19:49:43.111614] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:51.626 BaseBdev2 00:13:51.626 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:51.626 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:51.626 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:51.626 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:51.626 19:49:43 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:51.626 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:51.626 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:51.886 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:52.144 [ 00:13:52.144 { 00:13:52.144 "name": "BaseBdev2", 00:13:52.144 "aliases": [ 00:13:52.144 "16bf7472-ebe3-49f9-bd0b-dcc8311550f8" 00:13:52.144 ], 00:13:52.144 "product_name": "Malloc disk", 00:13:52.144 "block_size": 512, 00:13:52.144 "num_blocks": 65536, 00:13:52.144 "uuid": "16bf7472-ebe3-49f9-bd0b-dcc8311550f8", 00:13:52.144 "assigned_rate_limits": { 00:13:52.144 "rw_ios_per_sec": 0, 00:13:52.144 "rw_mbytes_per_sec": 0, 00:13:52.144 "r_mbytes_per_sec": 0, 00:13:52.144 "w_mbytes_per_sec": 0 00:13:52.144 }, 00:13:52.145 "claimed": true, 00:13:52.145 "claim_type": "exclusive_write", 00:13:52.145 "zoned": false, 00:13:52.145 "supported_io_types": { 00:13:52.145 "read": true, 00:13:52.145 "write": true, 00:13:52.145 "unmap": true, 00:13:52.145 "flush": true, 00:13:52.145 "reset": true, 00:13:52.145 "nvme_admin": false, 00:13:52.145 "nvme_io": false, 00:13:52.145 "nvme_io_md": false, 00:13:52.145 "write_zeroes": true, 00:13:52.145 "zcopy": true, 00:13:52.145 "get_zone_info": false, 00:13:52.145 "zone_management": false, 00:13:52.145 "zone_append": false, 00:13:52.145 "compare": false, 00:13:52.145 "compare_and_write": false, 00:13:52.145 "abort": true, 00:13:52.145 "seek_hole": false, 00:13:52.145 "seek_data": false, 00:13:52.145 "copy": true, 00:13:52.145 "nvme_iov_md": false 00:13:52.145 }, 00:13:52.145 "memory_domains": [ 00:13:52.145 { 00:13:52.145 "dma_device_id": "system", 
00:13:52.145 "dma_device_type": 1 00:13:52.145 }, 00:13:52.145 { 00:13:52.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.145 "dma_device_type": 2 00:13:52.145 } 00:13:52.145 ], 00:13:52.145 "driver_specific": {} 00:13:52.145 } 00:13:52.145 ] 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.145 19:49:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.404 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.404 "name": "Existed_Raid", 00:13:52.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.404 "strip_size_kb": 64, 00:13:52.404 "state": "configuring", 00:13:52.404 "raid_level": "raid0", 00:13:52.404 "superblock": false, 00:13:52.404 "num_base_bdevs": 3, 00:13:52.404 "num_base_bdevs_discovered": 2, 00:13:52.404 "num_base_bdevs_operational": 3, 00:13:52.404 "base_bdevs_list": [ 00:13:52.404 { 00:13:52.404 "name": "BaseBdev1", 00:13:52.404 "uuid": "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d", 00:13:52.404 "is_configured": true, 00:13:52.404 "data_offset": 0, 00:13:52.404 "data_size": 65536 00:13:52.404 }, 00:13:52.404 { 00:13:52.404 "name": "BaseBdev2", 00:13:52.404 "uuid": "16bf7472-ebe3-49f9-bd0b-dcc8311550f8", 00:13:52.404 "is_configured": true, 00:13:52.404 "data_offset": 0, 00:13:52.404 "data_size": 65536 00:13:52.404 }, 00:13:52.404 { 00:13:52.404 "name": "BaseBdev3", 00:13:52.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.404 "is_configured": false, 00:13:52.404 "data_offset": 0, 00:13:52.404 "data_size": 0 00:13:52.404 } 00:13:52.404 ] 00:13:52.404 }' 00:13:52.404 19:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.404 19:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:52.971 19:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:53.231 [2024-07-24 19:49:44.675487] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:53.231 [2024-07-24 19:49:44.675534] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f481d0 00:13:53.231 [2024-07-24 19:49:44.675542] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:53.231 [2024-07-24 19:49:44.675734] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ef370 00:13:53.231 [2024-07-24 19:49:44.675860] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f481d0 00:13:53.231 [2024-07-24 19:49:44.675870] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f481d0 00:13:53.231 [2024-07-24 19:49:44.676032] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:53.231 BaseBdev3 00:13:53.231 19:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:53.231 19:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:53.231 19:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:53.231 19:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:53.231 19:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:53.231 19:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:53.231 19:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:53.491 19:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:53.751 [ 00:13:53.751 { 00:13:53.751 "name": "BaseBdev3", 00:13:53.751 "aliases": [ 00:13:53.751 "9e98a4c4-14b7-40b1-bb46-b07fec621bb3" 00:13:53.751 ], 00:13:53.751 "product_name": "Malloc disk", 00:13:53.751 "block_size": 512, 00:13:53.751 "num_blocks": 65536, 00:13:53.751 
"uuid": "9e98a4c4-14b7-40b1-bb46-b07fec621bb3", 00:13:53.751 "assigned_rate_limits": { 00:13:53.751 "rw_ios_per_sec": 0, 00:13:53.751 "rw_mbytes_per_sec": 0, 00:13:53.751 "r_mbytes_per_sec": 0, 00:13:53.751 "w_mbytes_per_sec": 0 00:13:53.751 }, 00:13:53.751 "claimed": true, 00:13:53.751 "claim_type": "exclusive_write", 00:13:53.751 "zoned": false, 00:13:53.751 "supported_io_types": { 00:13:53.751 "read": true, 00:13:53.751 "write": true, 00:13:53.751 "unmap": true, 00:13:53.751 "flush": true, 00:13:53.751 "reset": true, 00:13:53.751 "nvme_admin": false, 00:13:53.751 "nvme_io": false, 00:13:53.751 "nvme_io_md": false, 00:13:53.751 "write_zeroes": true, 00:13:53.751 "zcopy": true, 00:13:53.751 "get_zone_info": false, 00:13:53.751 "zone_management": false, 00:13:53.751 "zone_append": false, 00:13:53.751 "compare": false, 00:13:53.751 "compare_and_write": false, 00:13:53.751 "abort": true, 00:13:53.751 "seek_hole": false, 00:13:53.751 "seek_data": false, 00:13:53.751 "copy": true, 00:13:53.751 "nvme_iov_md": false 00:13:53.751 }, 00:13:53.751 "memory_domains": [ 00:13:53.751 { 00:13:53.751 "dma_device_id": "system", 00:13:53.751 "dma_device_type": 1 00:13:53.751 }, 00:13:53.751 { 00:13:53.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.751 "dma_device_type": 2 00:13:53.751 } 00:13:53.751 ], 00:13:53.751 "driver_specific": {} 00:13:53.751 } 00:13:53.751 ] 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.751 19:49:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.751 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.752 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.752 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:54.082 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:54.082 "name": "Existed_Raid", 00:13:54.082 "uuid": "41848e70-dc5b-48dd-b5db-df39aa1dd330", 00:13:54.082 "strip_size_kb": 64, 00:13:54.082 "state": "online", 00:13:54.082 "raid_level": "raid0", 00:13:54.082 "superblock": false, 00:13:54.082 "num_base_bdevs": 3, 00:13:54.082 "num_base_bdevs_discovered": 3, 00:13:54.082 "num_base_bdevs_operational": 3, 00:13:54.082 "base_bdevs_list": [ 00:13:54.082 { 00:13:54.082 "name": "BaseBdev1", 00:13:54.082 "uuid": "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d", 00:13:54.082 "is_configured": true, 00:13:54.082 "data_offset": 0, 00:13:54.082 "data_size": 65536 00:13:54.082 }, 00:13:54.082 { 00:13:54.082 "name": "BaseBdev2", 00:13:54.082 "uuid": 
"16bf7472-ebe3-49f9-bd0b-dcc8311550f8", 00:13:54.082 "is_configured": true, 00:13:54.082 "data_offset": 0, 00:13:54.082 "data_size": 65536 00:13:54.082 }, 00:13:54.082 { 00:13:54.082 "name": "BaseBdev3", 00:13:54.082 "uuid": "9e98a4c4-14b7-40b1-bb46-b07fec621bb3", 00:13:54.082 "is_configured": true, 00:13:54.082 "data_offset": 0, 00:13:54.082 "data_size": 65536 00:13:54.082 } 00:13:54.082 ] 00:13:54.082 }' 00:13:54.082 19:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:54.082 19:49:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:54.651 [2024-07-24 19:49:46.171753] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:54.651 "name": "Existed_Raid", 00:13:54.651 "aliases": [ 00:13:54.651 "41848e70-dc5b-48dd-b5db-df39aa1dd330" 00:13:54.651 ], 00:13:54.651 "product_name": "Raid Volume", 
00:13:54.651 "block_size": 512, 00:13:54.651 "num_blocks": 196608, 00:13:54.651 "uuid": "41848e70-dc5b-48dd-b5db-df39aa1dd330", 00:13:54.651 "assigned_rate_limits": { 00:13:54.651 "rw_ios_per_sec": 0, 00:13:54.651 "rw_mbytes_per_sec": 0, 00:13:54.651 "r_mbytes_per_sec": 0, 00:13:54.651 "w_mbytes_per_sec": 0 00:13:54.651 }, 00:13:54.651 "claimed": false, 00:13:54.651 "zoned": false, 00:13:54.651 "supported_io_types": { 00:13:54.651 "read": true, 00:13:54.651 "write": true, 00:13:54.651 "unmap": true, 00:13:54.651 "flush": true, 00:13:54.651 "reset": true, 00:13:54.651 "nvme_admin": false, 00:13:54.651 "nvme_io": false, 00:13:54.651 "nvme_io_md": false, 00:13:54.651 "write_zeroes": true, 00:13:54.651 "zcopy": false, 00:13:54.651 "get_zone_info": false, 00:13:54.651 "zone_management": false, 00:13:54.651 "zone_append": false, 00:13:54.651 "compare": false, 00:13:54.651 "compare_and_write": false, 00:13:54.651 "abort": false, 00:13:54.651 "seek_hole": false, 00:13:54.651 "seek_data": false, 00:13:54.651 "copy": false, 00:13:54.651 "nvme_iov_md": false 00:13:54.651 }, 00:13:54.651 "memory_domains": [ 00:13:54.651 { 00:13:54.651 "dma_device_id": "system", 00:13:54.651 "dma_device_type": 1 00:13:54.651 }, 00:13:54.651 { 00:13:54.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.651 "dma_device_type": 2 00:13:54.651 }, 00:13:54.651 { 00:13:54.651 "dma_device_id": "system", 00:13:54.651 "dma_device_type": 1 00:13:54.651 }, 00:13:54.651 { 00:13:54.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.651 "dma_device_type": 2 00:13:54.651 }, 00:13:54.651 { 00:13:54.651 "dma_device_id": "system", 00:13:54.651 "dma_device_type": 1 00:13:54.651 }, 00:13:54.651 { 00:13:54.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.651 "dma_device_type": 2 00:13:54.651 } 00:13:54.651 ], 00:13:54.651 "driver_specific": { 00:13:54.651 "raid": { 00:13:54.651 "uuid": "41848e70-dc5b-48dd-b5db-df39aa1dd330", 00:13:54.651 "strip_size_kb": 64, 00:13:54.651 "state": "online", 00:13:54.651 
"raid_level": "raid0", 00:13:54.651 "superblock": false, 00:13:54.651 "num_base_bdevs": 3, 00:13:54.651 "num_base_bdevs_discovered": 3, 00:13:54.651 "num_base_bdevs_operational": 3, 00:13:54.651 "base_bdevs_list": [ 00:13:54.651 { 00:13:54.651 "name": "BaseBdev1", 00:13:54.651 "uuid": "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d", 00:13:54.651 "is_configured": true, 00:13:54.651 "data_offset": 0, 00:13:54.651 "data_size": 65536 00:13:54.651 }, 00:13:54.651 { 00:13:54.651 "name": "BaseBdev2", 00:13:54.651 "uuid": "16bf7472-ebe3-49f9-bd0b-dcc8311550f8", 00:13:54.651 "is_configured": true, 00:13:54.651 "data_offset": 0, 00:13:54.651 "data_size": 65536 00:13:54.651 }, 00:13:54.651 { 00:13:54.651 "name": "BaseBdev3", 00:13:54.651 "uuid": "9e98a4c4-14b7-40b1-bb46-b07fec621bb3", 00:13:54.651 "is_configured": true, 00:13:54.651 "data_offset": 0, 00:13:54.651 "data_size": 65536 00:13:54.651 } 00:13:54.651 ] 00:13:54.651 } 00:13:54.651 } 00:13:54.651 }' 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:54.651 BaseBdev2 00:13:54.651 BaseBdev3' 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:54.651 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:54.911 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.911 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.911 "name": "BaseBdev1", 00:13:54.911 "aliases": [ 00:13:54.911 "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d" 00:13:54.911 ], 00:13:54.911 "product_name": "Malloc disk", 00:13:54.911 
"block_size": 512, 00:13:54.911 "num_blocks": 65536, 00:13:54.911 "uuid": "f7cc4bbc-8751-4fc5-b66a-ecbf67b0341d", 00:13:54.911 "assigned_rate_limits": { 00:13:54.911 "rw_ios_per_sec": 0, 00:13:54.911 "rw_mbytes_per_sec": 0, 00:13:54.911 "r_mbytes_per_sec": 0, 00:13:54.911 "w_mbytes_per_sec": 0 00:13:54.911 }, 00:13:54.911 "claimed": true, 00:13:54.911 "claim_type": "exclusive_write", 00:13:54.911 "zoned": false, 00:13:54.911 "supported_io_types": { 00:13:54.911 "read": true, 00:13:54.911 "write": true, 00:13:54.911 "unmap": true, 00:13:54.911 "flush": true, 00:13:54.911 "reset": true, 00:13:54.911 "nvme_admin": false, 00:13:54.911 "nvme_io": false, 00:13:54.911 "nvme_io_md": false, 00:13:54.911 "write_zeroes": true, 00:13:54.911 "zcopy": true, 00:13:54.911 "get_zone_info": false, 00:13:54.911 "zone_management": false, 00:13:54.911 "zone_append": false, 00:13:54.911 "compare": false, 00:13:54.911 "compare_and_write": false, 00:13:54.911 "abort": true, 00:13:54.911 "seek_hole": false, 00:13:54.911 "seek_data": false, 00:13:54.911 "copy": true, 00:13:54.911 "nvme_iov_md": false 00:13:54.911 }, 00:13:54.911 "memory_domains": [ 00:13:54.911 { 00:13:54.911 "dma_device_id": "system", 00:13:54.911 "dma_device_type": 1 00:13:54.911 }, 00:13:54.911 { 00:13:54.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.911 "dma_device_type": 2 00:13:54.911 } 00:13:54.911 ], 00:13:54.911 "driver_specific": {} 00:13:54.911 }' 00:13:54.911 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.911 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:55.170 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.171 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.430 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:55.430 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:55.430 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:55.430 19:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:55.690 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:55.690 "name": "BaseBdev2", 00:13:55.690 "aliases": [ 00:13:55.691 "16bf7472-ebe3-49f9-bd0b-dcc8311550f8" 00:13:55.691 ], 00:13:55.691 "product_name": "Malloc disk", 00:13:55.691 "block_size": 512, 00:13:55.691 "num_blocks": 65536, 00:13:55.691 "uuid": "16bf7472-ebe3-49f9-bd0b-dcc8311550f8", 00:13:55.691 "assigned_rate_limits": { 00:13:55.691 "rw_ios_per_sec": 0, 00:13:55.691 "rw_mbytes_per_sec": 0, 00:13:55.691 "r_mbytes_per_sec": 0, 00:13:55.691 "w_mbytes_per_sec": 0 00:13:55.691 }, 00:13:55.691 "claimed": true, 00:13:55.691 "claim_type": "exclusive_write", 00:13:55.691 "zoned": false, 00:13:55.691 "supported_io_types": { 00:13:55.691 "read": true, 00:13:55.691 "write": true, 00:13:55.691 "unmap": true, 00:13:55.691 "flush": true, 00:13:55.691 "reset": true, 00:13:55.691 "nvme_admin": 
false, 00:13:55.691 "nvme_io": false, 00:13:55.691 "nvme_io_md": false, 00:13:55.691 "write_zeroes": true, 00:13:55.691 "zcopy": true, 00:13:55.691 "get_zone_info": false, 00:13:55.691 "zone_management": false, 00:13:55.691 "zone_append": false, 00:13:55.691 "compare": false, 00:13:55.691 "compare_and_write": false, 00:13:55.691 "abort": true, 00:13:55.691 "seek_hole": false, 00:13:55.691 "seek_data": false, 00:13:55.691 "copy": true, 00:13:55.691 "nvme_iov_md": false 00:13:55.691 }, 00:13:55.691 "memory_domains": [ 00:13:55.691 { 00:13:55.691 "dma_device_id": "system", 00:13:55.691 "dma_device_type": 1 00:13:55.691 }, 00:13:55.691 { 00:13:55.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.691 "dma_device_type": 2 00:13:55.691 } 00:13:55.691 ], 00:13:55.691 "driver_specific": {} 00:13:55.691 }' 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:55.691 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.950 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.950 19:49:47 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:55.950 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:55.950 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:55.950 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:56.210 "name": "BaseBdev3", 00:13:56.210 "aliases": [ 00:13:56.210 "9e98a4c4-14b7-40b1-bb46-b07fec621bb3" 00:13:56.210 ], 00:13:56.210 "product_name": "Malloc disk", 00:13:56.210 "block_size": 512, 00:13:56.210 "num_blocks": 65536, 00:13:56.210 "uuid": "9e98a4c4-14b7-40b1-bb46-b07fec621bb3", 00:13:56.210 "assigned_rate_limits": { 00:13:56.210 "rw_ios_per_sec": 0, 00:13:56.210 "rw_mbytes_per_sec": 0, 00:13:56.210 "r_mbytes_per_sec": 0, 00:13:56.210 "w_mbytes_per_sec": 0 00:13:56.210 }, 00:13:56.210 "claimed": true, 00:13:56.210 "claim_type": "exclusive_write", 00:13:56.210 "zoned": false, 00:13:56.210 "supported_io_types": { 00:13:56.210 "read": true, 00:13:56.210 "write": true, 00:13:56.210 "unmap": true, 00:13:56.210 "flush": true, 00:13:56.210 "reset": true, 00:13:56.210 "nvme_admin": false, 00:13:56.210 "nvme_io": false, 00:13:56.210 "nvme_io_md": false, 00:13:56.210 "write_zeroes": true, 00:13:56.210 "zcopy": true, 00:13:56.210 "get_zone_info": false, 00:13:56.210 "zone_management": false, 00:13:56.210 "zone_append": false, 00:13:56.210 "compare": false, 00:13:56.210 "compare_and_write": false, 00:13:56.210 "abort": true, 00:13:56.210 "seek_hole": false, 00:13:56.210 "seek_data": false, 00:13:56.210 "copy": true, 00:13:56.210 "nvme_iov_md": false 00:13:56.210 }, 00:13:56.210 "memory_domains": [ 00:13:56.210 { 00:13:56.210 "dma_device_id": "system", 00:13:56.210 "dma_device_type": 1 00:13:56.210 
}, 00:13:56.210 { 00:13:56.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.210 "dma_device_type": 2 00:13:56.210 } 00:13:56.210 ], 00:13:56.210 "driver_specific": {} 00:13:56.210 }' 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:56.210 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:56.469 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:56.469 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:56.469 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:56.469 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:56.469 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:56.469 19:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:56.728 [2024-07-24 19:49:48.180815] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:56.728 [2024-07-24 19:49:48.180843] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:56.728 [2024-07-24 19:49:48.180883] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:56.728 
19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:56.728 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:56.728 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:56.728 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:56.728 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:56.728 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:56.728 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.729 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:56.988 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.988 "name": "Existed_Raid", 00:13:56.988 "uuid": "41848e70-dc5b-48dd-b5db-df39aa1dd330", 00:13:56.988 "strip_size_kb": 64, 00:13:56.988 "state": "offline", 00:13:56.988 "raid_level": "raid0", 00:13:56.988 "superblock": false, 00:13:56.988 "num_base_bdevs": 3, 00:13:56.988 "num_base_bdevs_discovered": 2, 00:13:56.988 "num_base_bdevs_operational": 2, 00:13:56.988 "base_bdevs_list": [ 00:13:56.988 { 00:13:56.988 "name": null, 00:13:56.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.988 "is_configured": false, 00:13:56.988 "data_offset": 0, 00:13:56.988 "data_size": 65536 00:13:56.988 }, 00:13:56.988 { 00:13:56.988 "name": "BaseBdev2", 00:13:56.988 "uuid": "16bf7472-ebe3-49f9-bd0b-dcc8311550f8", 00:13:56.988 "is_configured": true, 00:13:56.988 "data_offset": 0, 00:13:56.988 "data_size": 65536 00:13:56.988 }, 00:13:56.988 { 00:13:56.988 "name": "BaseBdev3", 00:13:56.988 "uuid": "9e98a4c4-14b7-40b1-bb46-b07fec621bb3", 00:13:56.988 "is_configured": true, 00:13:56.988 "data_offset": 0, 00:13:56.988 "data_size": 65536 00:13:56.988 } 00:13:56.988 ] 00:13:56.988 }' 00:13:56.988 19:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.988 19:49:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.555 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:57.555 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:57.555 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:57.555 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.815 19:49:49 bdev_raid.raid_state_function_test -- 
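After `bdev_malloc_delete BaseBdev1`, the array transitions to `offline` (raid0 has no redundancy, so `has_redundancy` returns 1 and the expected state flips from `online` to `offline`), and `verify_raid_bdev_state Existed_Raid offline raid0 64 2` checks the record shown above. A sketch of that comparison, assuming the field names from the log's JSON (the function below mirrors the shell helper's checks but is not the actual implementation):

```python
import json

# Subset of the bdev_raid_get_bdevs output after BaseBdev1 was removed,
# as dumped in the log.
raid_bdev_info = json.loads("""
{
  "name": "Existed_Raid",
  "state": "offline",
  "raid_level": "raid0",
  "strip_size_kb": 64,
  "num_base_bdevs": 3,
  "num_base_bdevs_discovered": 2,
  "num_base_bdevs_operational": 2
}
""")

def verify_raid_bdev_state(info, expected_state, raid_level,
                           strip_size, num_operational):
    """Mirror the checks the shell helper applies to the jq-extracted
    record: state, level, strip size, and operational bdev count."""
    return (info["state"] == expected_state
            and info["raid_level"] == raid_level
            and info["strip_size_kb"] == strip_size
            and info["num_base_bdevs_operational"] == num_operational)

assert verify_raid_bdev_state(raid_bdev_info, "offline", "raid0", 64, 2)
```

Note how the deleted member survives in `base_bdevs_list` as a placeholder with `"name": null` and an all-zero uuid, which is why `num_base_bdevs` stays 3 while discovered/operational drop to 2.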
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:57.815 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:57.815 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:58.074 [2024-07-24 19:49:49.462098] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:58.074 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:58.074 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:58.074 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.074 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:58.332 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:58.332 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:58.332 19:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:58.592 [2024-07-24 19:49:49.974144] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:58.592 [2024-07-24 19:49:49.974188] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f481d0 name Existed_Raid, state offline 00:13:58.592 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:58.592 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:58.592 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.592 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:58.851 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:58.851 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:58.851 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:58.851 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:58.851 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:58.851 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:59.110 BaseBdev2 00:13:59.110 19:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:59.110 19:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:59.110 19:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:59.110 19:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:59.110 19:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:59.110 19:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:59.110 19:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:59.369 19:49:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:59.629 [ 00:13:59.629 { 00:13:59.629 "name": "BaseBdev2", 00:13:59.629 "aliases": [ 00:13:59.629 "18281f88-32de-412f-8c26-a50ebd24bac9" 00:13:59.629 ], 00:13:59.629 "product_name": "Malloc disk", 00:13:59.629 "block_size": 512, 00:13:59.629 "num_blocks": 65536, 00:13:59.629 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:13:59.629 "assigned_rate_limits": { 00:13:59.629 "rw_ios_per_sec": 0, 00:13:59.629 "rw_mbytes_per_sec": 0, 00:13:59.629 "r_mbytes_per_sec": 0, 00:13:59.629 "w_mbytes_per_sec": 0 00:13:59.629 }, 00:13:59.629 "claimed": false, 00:13:59.629 "zoned": false, 00:13:59.629 "supported_io_types": { 00:13:59.629 "read": true, 00:13:59.629 "write": true, 00:13:59.629 "unmap": true, 00:13:59.629 "flush": true, 00:13:59.629 "reset": true, 00:13:59.629 "nvme_admin": false, 00:13:59.629 "nvme_io": false, 00:13:59.629 "nvme_io_md": false, 00:13:59.629 "write_zeroes": true, 00:13:59.629 "zcopy": true, 00:13:59.629 "get_zone_info": false, 00:13:59.629 "zone_management": false, 00:13:59.629 "zone_append": false, 00:13:59.629 "compare": false, 00:13:59.629 "compare_and_write": false, 00:13:59.629 "abort": true, 00:13:59.629 "seek_hole": false, 00:13:59.629 "seek_data": false, 00:13:59.629 "copy": true, 00:13:59.629 "nvme_iov_md": false 00:13:59.629 }, 00:13:59.629 "memory_domains": [ 00:13:59.629 { 00:13:59.629 "dma_device_id": "system", 00:13:59.629 "dma_device_type": 1 00:13:59.629 }, 00:13:59.629 { 00:13:59.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.629 "dma_device_type": 2 00:13:59.629 } 00:13:59.629 ], 00:13:59.629 "driver_specific": {} 00:13:59.629 } 00:13:59.629 ] 00:13:59.629 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:59.629 19:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:59.629 19:49:51 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:59.629 19:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:59.888 BaseBdev3 00:13:59.888 19:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:59.888 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:59.888 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:59.888 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:59.888 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:59.888 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:59.888 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:00.148 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:00.407 [ 00:14:00.407 { 00:14:00.407 "name": "BaseBdev3", 00:14:00.407 "aliases": [ 00:14:00.407 "0d11cbcd-fd55-498b-85de-f183b5566ce9" 00:14:00.407 ], 00:14:00.407 "product_name": "Malloc disk", 00:14:00.407 "block_size": 512, 00:14:00.407 "num_blocks": 65536, 00:14:00.407 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:00.407 "assigned_rate_limits": { 00:14:00.407 "rw_ios_per_sec": 0, 00:14:00.407 "rw_mbytes_per_sec": 0, 00:14:00.407 "r_mbytes_per_sec": 0, 00:14:00.407 "w_mbytes_per_sec": 0 00:14:00.407 }, 00:14:00.407 "claimed": false, 00:14:00.407 "zoned": false, 00:14:00.407 
"supported_io_types": { 00:14:00.407 "read": true, 00:14:00.407 "write": true, 00:14:00.407 "unmap": true, 00:14:00.407 "flush": true, 00:14:00.407 "reset": true, 00:14:00.407 "nvme_admin": false, 00:14:00.407 "nvme_io": false, 00:14:00.407 "nvme_io_md": false, 00:14:00.407 "write_zeroes": true, 00:14:00.407 "zcopy": true, 00:14:00.407 "get_zone_info": false, 00:14:00.407 "zone_management": false, 00:14:00.407 "zone_append": false, 00:14:00.407 "compare": false, 00:14:00.407 "compare_and_write": false, 00:14:00.407 "abort": true, 00:14:00.407 "seek_hole": false, 00:14:00.407 "seek_data": false, 00:14:00.407 "copy": true, 00:14:00.407 "nvme_iov_md": false 00:14:00.407 }, 00:14:00.407 "memory_domains": [ 00:14:00.407 { 00:14:00.407 "dma_device_id": "system", 00:14:00.407 "dma_device_type": 1 00:14:00.407 }, 00:14:00.407 { 00:14:00.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.407 "dma_device_type": 2 00:14:00.407 } 00:14:00.407 ], 00:14:00.407 "driver_specific": {} 00:14:00.407 } 00:14:00.407 ] 00:14:00.407 19:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:00.407 19:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:00.407 19:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:00.407 19:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:00.407 [2024-07-24 19:49:51.997758] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:00.408 [2024-07-24 19:49:51.997810] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:00.408 [2024-07-24 19:49:51.997828] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:00.408 
[2024-07-24 19:49:51.999144] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.667 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.926 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.926 "name": "Existed_Raid", 00:14:00.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.926 "strip_size_kb": 64, 00:14:00.926 "state": "configuring", 00:14:00.926 "raid_level": "raid0", 00:14:00.926 "superblock": false, 00:14:00.926 "num_base_bdevs": 3, 00:14:00.926 
"num_base_bdevs_discovered": 2, 00:14:00.927 "num_base_bdevs_operational": 3, 00:14:00.927 "base_bdevs_list": [ 00:14:00.927 { 00:14:00.927 "name": "BaseBdev1", 00:14:00.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.927 "is_configured": false, 00:14:00.927 "data_offset": 0, 00:14:00.927 "data_size": 0 00:14:00.927 }, 00:14:00.927 { 00:14:00.927 "name": "BaseBdev2", 00:14:00.927 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:00.927 "is_configured": true, 00:14:00.927 "data_offset": 0, 00:14:00.927 "data_size": 65536 00:14:00.927 }, 00:14:00.927 { 00:14:00.927 "name": "BaseBdev3", 00:14:00.927 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:00.927 "is_configured": true, 00:14:00.927 "data_offset": 0, 00:14:00.927 "data_size": 65536 00:14:00.927 } 00:14:00.927 ] 00:14:00.927 }' 00:14:00.927 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.927 19:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.495 19:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:01.495 [2024-07-24 19:49:53.016452] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.495 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.754 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.754 "name": "Existed_Raid", 00:14:01.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.754 "strip_size_kb": 64, 00:14:01.754 "state": "configuring", 00:14:01.754 "raid_level": "raid0", 00:14:01.754 "superblock": false, 00:14:01.754 "num_base_bdevs": 3, 00:14:01.754 "num_base_bdevs_discovered": 1, 00:14:01.754 "num_base_bdevs_operational": 3, 00:14:01.754 "base_bdevs_list": [ 00:14:01.754 { 00:14:01.754 "name": "BaseBdev1", 00:14:01.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.754 "is_configured": false, 00:14:01.754 "data_offset": 0, 00:14:01.754 "data_size": 0 00:14:01.754 }, 00:14:01.754 { 00:14:01.754 "name": null, 00:14:01.754 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:01.754 "is_configured": false, 00:14:01.754 "data_offset": 0, 00:14:01.754 "data_size": 65536 00:14:01.754 }, 00:14:01.754 { 00:14:01.754 "name": "BaseBdev3", 00:14:01.754 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:01.754 "is_configured": true, 00:14:01.754 "data_offset": 0, 00:14:01.754 "data_size": 65536 00:14:01.754 } 
00:14:01.754 ] 00:14:01.754 }' 00:14:01.754 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.754 19:49:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.322 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.322 19:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:02.582 19:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:02.582 19:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:03.150 [2024-07-24 19:49:54.637270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:03.151 BaseBdev1 00:14:03.151 19:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:03.151 19:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:03.151 19:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:03.151 19:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:03.151 19:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:03.151 19:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:03.151 19:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:03.410 19:49:54 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:03.669 [ 00:14:03.669 { 00:14:03.669 "name": "BaseBdev1", 00:14:03.669 "aliases": [ 00:14:03.669 "6252b736-64bd-4327-b861-8191a6cf0e35" 00:14:03.669 ], 00:14:03.669 "product_name": "Malloc disk", 00:14:03.669 "block_size": 512, 00:14:03.669 "num_blocks": 65536, 00:14:03.669 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:03.669 "assigned_rate_limits": { 00:14:03.669 "rw_ios_per_sec": 0, 00:14:03.669 "rw_mbytes_per_sec": 0, 00:14:03.669 "r_mbytes_per_sec": 0, 00:14:03.669 "w_mbytes_per_sec": 0 00:14:03.669 }, 00:14:03.669 "claimed": true, 00:14:03.669 "claim_type": "exclusive_write", 00:14:03.669 "zoned": false, 00:14:03.669 "supported_io_types": { 00:14:03.669 "read": true, 00:14:03.669 "write": true, 00:14:03.669 "unmap": true, 00:14:03.669 "flush": true, 00:14:03.669 "reset": true, 00:14:03.669 "nvme_admin": false, 00:14:03.669 "nvme_io": false, 00:14:03.669 "nvme_io_md": false, 00:14:03.669 "write_zeroes": true, 00:14:03.669 "zcopy": true, 00:14:03.669 "get_zone_info": false, 00:14:03.669 "zone_management": false, 00:14:03.669 "zone_append": false, 00:14:03.669 "compare": false, 00:14:03.669 "compare_and_write": false, 00:14:03.669 "abort": true, 00:14:03.669 "seek_hole": false, 00:14:03.669 "seek_data": false, 00:14:03.669 "copy": true, 00:14:03.669 "nvme_iov_md": false 00:14:03.669 }, 00:14:03.669 "memory_domains": [ 00:14:03.669 { 00:14:03.669 "dma_device_id": "system", 00:14:03.669 "dma_device_type": 1 00:14:03.669 }, 00:14:03.669 { 00:14:03.669 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.669 "dma_device_type": 2 00:14:03.669 } 00:14:03.669 ], 00:14:03.669 "driver_specific": {} 00:14:03.669 } 00:14:03.669 ] 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.669 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.670 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.670 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.670 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.670 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.670 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.929 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.929 "name": "Existed_Raid", 00:14:03.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.929 "strip_size_kb": 64, 00:14:03.929 "state": "configuring", 00:14:03.929 "raid_level": "raid0", 00:14:03.929 "superblock": false, 00:14:03.929 "num_base_bdevs": 3, 00:14:03.929 "num_base_bdevs_discovered": 2, 00:14:03.929 "num_base_bdevs_operational": 3, 00:14:03.929 "base_bdevs_list": [ 00:14:03.929 { 00:14:03.929 "name": "BaseBdev1", 00:14:03.929 
"uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:03.929 "is_configured": true, 00:14:03.929 "data_offset": 0, 00:14:03.930 "data_size": 65536 00:14:03.930 }, 00:14:03.930 { 00:14:03.930 "name": null, 00:14:03.930 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:03.930 "is_configured": false, 00:14:03.930 "data_offset": 0, 00:14:03.930 "data_size": 65536 00:14:03.930 }, 00:14:03.930 { 00:14:03.930 "name": "BaseBdev3", 00:14:03.930 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:03.930 "is_configured": true, 00:14:03.930 "data_offset": 0, 00:14:03.930 "data_size": 65536 00:14:03.930 } 00:14:03.930 ] 00:14:03.930 }' 00:14:03.930 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.930 19:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:04.498 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.498 19:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:04.758 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:04.758 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:05.018 [2024-07-24 19:49:56.466138] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
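The `verify_raid_bdev_state Existed_Raid configuring raid0 64 3` calls traced above all follow the same pattern: fetch `bdev_raid_get_bdevs all` over the RPC socket, filter with `jq -r '.[] | select(.name == "Existed_Raid")'`, and compare the reported fields against the expected values. A minimal Python sketch of that check (field names are taken from the JSON dumps in this log; the function name is my own, not a helper from the test scripts):

```python
import json

def verify_raid_bdev_state(raid_bdevs_json, name, expected_state,
                           raid_level, strip_size_kb, num_operational):
    # Equivalent of jq '.[] | select(.name == "Existed_Raid")'
    info = next(b for b in json.loads(raid_bdevs_json) if b["name"] == name)
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size_kb
    assert info["num_base_bdevs_operational"] == num_operational
    # num_base_bdevs_discovered tracks how many slots are configured
    configured = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert configured == info["num_base_bdevs_discovered"]
    return info

# Sample trimmed from the bdev_raid_get_bdevs output in this log
# (state after bdev_raid_remove_base_bdev has emptied two slots)
sample = json.dumps([{
    "name": "Existed_Raid",
    "strip_size_kb": 64,
    "state": "configuring",
    "raid_level": "raid0",
    "num_base_bdevs": 3,
    "num_base_bdevs_discovered": 1,
    "num_base_bdevs_operational": 3,
    "base_bdevs_list": [
        {"name": "BaseBdev1", "is_configured": True},
        {"name": None, "is_configured": False},
        {"name": None, "is_configured": False},
    ],
}])
info = verify_raid_bdev_state(sample, "Existed_Raid", "configuring",
                              "raid0", 64, 3)
```

As in the trace, the raid bdev stays in the `configuring` state (and keeps the all-zero placeholder UUID) until every one of the three operational slots is discovered.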
00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.018 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.277 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.277 "name": "Existed_Raid", 00:14:05.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.277 "strip_size_kb": 64, 00:14:05.277 "state": "configuring", 00:14:05.277 "raid_level": "raid0", 00:14:05.277 "superblock": false, 00:14:05.277 "num_base_bdevs": 3, 00:14:05.278 "num_base_bdevs_discovered": 1, 00:14:05.278 "num_base_bdevs_operational": 3, 00:14:05.278 "base_bdevs_list": [ 00:14:05.278 { 00:14:05.278 "name": "BaseBdev1", 00:14:05.278 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:05.278 "is_configured": true, 00:14:05.278 "data_offset": 0, 00:14:05.278 "data_size": 65536 00:14:05.278 }, 00:14:05.278 { 00:14:05.278 "name": null, 00:14:05.278 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:05.278 "is_configured": false, 00:14:05.278 
"data_offset": 0, 00:14:05.278 "data_size": 65536 00:14:05.278 }, 00:14:05.278 { 00:14:05.278 "name": null, 00:14:05.278 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:05.278 "is_configured": false, 00:14:05.278 "data_offset": 0, 00:14:05.278 "data_size": 65536 00:14:05.278 } 00:14:05.278 ] 00:14:05.278 }' 00:14:05.278 19:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.278 19:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.846 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.846 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:06.137 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:06.137 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:06.396 [2024-07-24 19:49:57.809715] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.396 19:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.655 19:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.655 "name": "Existed_Raid", 00:14:06.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.655 "strip_size_kb": 64, 00:14:06.655 "state": "configuring", 00:14:06.655 "raid_level": "raid0", 00:14:06.655 "superblock": false, 00:14:06.655 "num_base_bdevs": 3, 00:14:06.655 "num_base_bdevs_discovered": 2, 00:14:06.655 "num_base_bdevs_operational": 3, 00:14:06.655 "base_bdevs_list": [ 00:14:06.655 { 00:14:06.655 "name": "BaseBdev1", 00:14:06.655 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:06.655 "is_configured": true, 00:14:06.655 "data_offset": 0, 00:14:06.655 "data_size": 65536 00:14:06.655 }, 00:14:06.655 { 00:14:06.655 "name": null, 00:14:06.655 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:06.655 "is_configured": false, 00:14:06.655 "data_offset": 0, 00:14:06.655 "data_size": 65536 00:14:06.655 }, 00:14:06.655 { 00:14:06.655 "name": "BaseBdev3", 00:14:06.655 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:06.655 "is_configured": true, 00:14:06.655 "data_offset": 0, 00:14:06.655 "data_size": 65536 00:14:06.655 } 00:14:06.655 ] 
00:14:06.655 }' 00:14:06.655 19:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.655 19:49:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.224 19:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:07.224 19:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.483 19:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:07.483 19:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:07.742 [2024-07-24 19:49:59.141270] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.742 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.002 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.002 "name": "Existed_Raid", 00:14:08.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.002 "strip_size_kb": 64, 00:14:08.002 "state": "configuring", 00:14:08.002 "raid_level": "raid0", 00:14:08.002 "superblock": false, 00:14:08.002 "num_base_bdevs": 3, 00:14:08.002 "num_base_bdevs_discovered": 1, 00:14:08.002 "num_base_bdevs_operational": 3, 00:14:08.002 "base_bdevs_list": [ 00:14:08.002 { 00:14:08.002 "name": null, 00:14:08.002 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:08.002 "is_configured": false, 00:14:08.002 "data_offset": 0, 00:14:08.002 "data_size": 65536 00:14:08.002 }, 00:14:08.002 { 00:14:08.002 "name": null, 00:14:08.002 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:08.002 "is_configured": false, 00:14:08.002 "data_offset": 0, 00:14:08.002 "data_size": 65536 00:14:08.002 }, 00:14:08.002 { 00:14:08.002 "name": "BaseBdev3", 00:14:08.002 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:08.002 "is_configured": true, 00:14:08.002 "data_offset": 0, 00:14:08.002 "data_size": 65536 00:14:08.002 } 00:14:08.002 ] 00:14:08.002 }' 00:14:08.002 19:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.002 19:49:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.570 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.570 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:08.829 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:08.829 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:09.088 [2024-07-24 19:50:00.513209] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.088 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.089 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.348 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.348 "name": "Existed_Raid", 00:14:09.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.348 "strip_size_kb": 64, 00:14:09.348 "state": "configuring", 00:14:09.348 "raid_level": "raid0", 00:14:09.348 "superblock": false, 00:14:09.348 "num_base_bdevs": 3, 00:14:09.348 "num_base_bdevs_discovered": 2, 00:14:09.348 "num_base_bdevs_operational": 3, 00:14:09.348 "base_bdevs_list": [ 00:14:09.348 { 00:14:09.348 "name": null, 00:14:09.348 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:09.348 "is_configured": false, 00:14:09.348 "data_offset": 0, 00:14:09.348 "data_size": 65536 00:14:09.348 }, 00:14:09.348 { 00:14:09.348 "name": "BaseBdev2", 00:14:09.348 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:09.348 "is_configured": true, 00:14:09.348 "data_offset": 0, 00:14:09.348 "data_size": 65536 00:14:09.348 }, 00:14:09.348 { 00:14:09.348 "name": "BaseBdev3", 00:14:09.348 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:09.348 "is_configured": true, 00:14:09.348 "data_offset": 0, 00:14:09.348 "data_size": 65536 00:14:09.348 } 00:14:09.348 ] 00:14:09.348 }' 00:14:09.348 19:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.348 19:50:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.915 19:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.915 19:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:10.173 
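The single-slot probes above (`jq '.[0].base_bdevs_list[1].is_configured'` and its `[[ true == \t\r\u\e ]]` comparison) check one base-bdev slot at a time after each remove/add. The equivalent lookup, sketched in Python with a sample list shaped like the dumps in this log (the helper name is hypothetical):

```python
import json

def slot_is_configured(raid_bdevs_json, slot):
    # Equivalent of jq '.[0].base_bdevs_list[SLOT].is_configured'
    bdevs = json.loads(raid_bdevs_json)
    return bdevs[0]["base_bdevs_list"][slot]["is_configured"]

# Sample shaped like the log: slot 1 (BaseBdev2) was just re-added
# via bdev_raid_add_base_bdev, slot 0 is still an empty slot
sample = json.dumps([{
    "name": "Existed_Raid",
    "base_bdevs_list": [
        {"name": None, "is_configured": False},
        {"name": "BaseBdev2", "is_configured": True},
        {"name": "BaseBdev3", "is_configured": True},
    ],
}])
print(slot_is_configured(sample, 1))  # -> True
```

A removed base bdev leaves its slot entry in place with `"name": null` and `"is_configured": false`, which is why the trace indexes slots positionally rather than by name.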
19:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:10.173 19:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.173 19:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:10.432 19:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6252b736-64bd-4327-b861-8191a6cf0e35 00:14:10.692 [2024-07-24 19:50:02.106003] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:10.692 [2024-07-24 19:50:02.106043] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20ef4b0 00:14:10.692 [2024-07-24 19:50:02.106052] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:10.692 [2024-07-24 19:50:02.106250] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e620d0 00:14:10.692 [2024-07-24 19:50:02.106374] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20ef4b0 00:14:10.692 [2024-07-24 19:50:02.106385] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20ef4b0 00:14:10.692 [2024-07-24 19:50:02.106561] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:10.692 NewBaseBdev 00:14:10.692 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:10.692 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:14:10.692 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:10.692 19:50:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # local i 00:14:10.692 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:10.692 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:10.692 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:10.951 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:11.209 [ 00:14:11.209 { 00:14:11.209 "name": "NewBaseBdev", 00:14:11.209 "aliases": [ 00:14:11.209 "6252b736-64bd-4327-b861-8191a6cf0e35" 00:14:11.209 ], 00:14:11.209 "product_name": "Malloc disk", 00:14:11.209 "block_size": 512, 00:14:11.209 "num_blocks": 65536, 00:14:11.209 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:11.209 "assigned_rate_limits": { 00:14:11.209 "rw_ios_per_sec": 0, 00:14:11.209 "rw_mbytes_per_sec": 0, 00:14:11.209 "r_mbytes_per_sec": 0, 00:14:11.209 "w_mbytes_per_sec": 0 00:14:11.209 }, 00:14:11.209 "claimed": true, 00:14:11.209 "claim_type": "exclusive_write", 00:14:11.209 "zoned": false, 00:14:11.209 "supported_io_types": { 00:14:11.209 "read": true, 00:14:11.209 "write": true, 00:14:11.210 "unmap": true, 00:14:11.210 "flush": true, 00:14:11.210 "reset": true, 00:14:11.210 "nvme_admin": false, 00:14:11.210 "nvme_io": false, 00:14:11.210 "nvme_io_md": false, 00:14:11.210 "write_zeroes": true, 00:14:11.210 "zcopy": true, 00:14:11.210 "get_zone_info": false, 00:14:11.210 "zone_management": false, 00:14:11.210 "zone_append": false, 00:14:11.210 "compare": false, 00:14:11.210 "compare_and_write": false, 00:14:11.210 "abort": true, 00:14:11.210 "seek_hole": false, 00:14:11.210 "seek_data": false, 00:14:11.210 "copy": true, 00:14:11.210 "nvme_iov_md": 
false 00:14:11.210 }, 00:14:11.210 "memory_domains": [ 00:14:11.210 { 00:14:11.210 "dma_device_id": "system", 00:14:11.210 "dma_device_type": 1 00:14:11.210 }, 00:14:11.210 { 00:14:11.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.210 "dma_device_type": 2 00:14:11.210 } 00:14:11.210 ], 00:14:11.210 "driver_specific": {} 00:14:11.210 } 00:14:11.210 ] 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.210 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.469 19:50:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.469 "name": "Existed_Raid", 00:14:11.469 "uuid": "356e61e2-2efc-47e2-ad4f-4ff1dcf4253e", 00:14:11.469 "strip_size_kb": 64, 00:14:11.469 "state": "online", 00:14:11.469 "raid_level": "raid0", 00:14:11.469 "superblock": false, 00:14:11.469 "num_base_bdevs": 3, 00:14:11.469 "num_base_bdevs_discovered": 3, 00:14:11.469 "num_base_bdevs_operational": 3, 00:14:11.469 "base_bdevs_list": [ 00:14:11.469 { 00:14:11.469 "name": "NewBaseBdev", 00:14:11.469 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:11.469 "is_configured": true, 00:14:11.469 "data_offset": 0, 00:14:11.469 "data_size": 65536 00:14:11.469 }, 00:14:11.469 { 00:14:11.469 "name": "BaseBdev2", 00:14:11.469 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:11.469 "is_configured": true, 00:14:11.469 "data_offset": 0, 00:14:11.469 "data_size": 65536 00:14:11.469 }, 00:14:11.469 { 00:14:11.469 "name": "BaseBdev3", 00:14:11.469 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:11.469 "is_configured": true, 00:14:11.469 "data_offset": 0, 00:14:11.469 "data_size": 65536 00:14:11.469 } 00:14:11.469 ] 00:14:11.469 }' 00:14:11.469 19:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.469 19:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.037 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:12.037 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:12.037 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:12.037 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:12.037 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:12.037 19:50:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:12.037 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:12.037 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:12.297 [2024-07-24 19:50:03.722595] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:12.297 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:12.297 "name": "Existed_Raid", 00:14:12.297 "aliases": [ 00:14:12.297 "356e61e2-2efc-47e2-ad4f-4ff1dcf4253e" 00:14:12.297 ], 00:14:12.297 "product_name": "Raid Volume", 00:14:12.297 "block_size": 512, 00:14:12.297 "num_blocks": 196608, 00:14:12.297 "uuid": "356e61e2-2efc-47e2-ad4f-4ff1dcf4253e", 00:14:12.297 "assigned_rate_limits": { 00:14:12.297 "rw_ios_per_sec": 0, 00:14:12.297 "rw_mbytes_per_sec": 0, 00:14:12.297 "r_mbytes_per_sec": 0, 00:14:12.297 "w_mbytes_per_sec": 0 00:14:12.297 }, 00:14:12.297 "claimed": false, 00:14:12.297 "zoned": false, 00:14:12.297 "supported_io_types": { 00:14:12.297 "read": true, 00:14:12.297 "write": true, 00:14:12.297 "unmap": true, 00:14:12.297 "flush": true, 00:14:12.297 "reset": true, 00:14:12.297 "nvme_admin": false, 00:14:12.297 "nvme_io": false, 00:14:12.297 "nvme_io_md": false, 00:14:12.297 "write_zeroes": true, 00:14:12.297 "zcopy": false, 00:14:12.297 "get_zone_info": false, 00:14:12.297 "zone_management": false, 00:14:12.297 "zone_append": false, 00:14:12.297 "compare": false, 00:14:12.297 "compare_and_write": false, 00:14:12.297 "abort": false, 00:14:12.297 "seek_hole": false, 00:14:12.297 "seek_data": false, 00:14:12.297 "copy": false, 00:14:12.297 "nvme_iov_md": false 00:14:12.297 }, 00:14:12.297 "memory_domains": [ 00:14:12.297 { 00:14:12.297 "dma_device_id": "system", 00:14:12.297 "dma_device_type": 1 00:14:12.297 }, 
00:14:12.297 { 00:14:12.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.297 "dma_device_type": 2 00:14:12.297 }, 00:14:12.297 { 00:14:12.297 "dma_device_id": "system", 00:14:12.297 "dma_device_type": 1 00:14:12.297 }, 00:14:12.297 { 00:14:12.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.297 "dma_device_type": 2 00:14:12.297 }, 00:14:12.297 { 00:14:12.297 "dma_device_id": "system", 00:14:12.297 "dma_device_type": 1 00:14:12.297 }, 00:14:12.297 { 00:14:12.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.297 "dma_device_type": 2 00:14:12.297 } 00:14:12.297 ], 00:14:12.297 "driver_specific": { 00:14:12.297 "raid": { 00:14:12.297 "uuid": "356e61e2-2efc-47e2-ad4f-4ff1dcf4253e", 00:14:12.297 "strip_size_kb": 64, 00:14:12.297 "state": "online", 00:14:12.297 "raid_level": "raid0", 00:14:12.297 "superblock": false, 00:14:12.297 "num_base_bdevs": 3, 00:14:12.297 "num_base_bdevs_discovered": 3, 00:14:12.297 "num_base_bdevs_operational": 3, 00:14:12.297 "base_bdevs_list": [ 00:14:12.297 { 00:14:12.297 "name": "NewBaseBdev", 00:14:12.297 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:12.297 "is_configured": true, 00:14:12.297 "data_offset": 0, 00:14:12.297 "data_size": 65536 00:14:12.298 }, 00:14:12.298 { 00:14:12.298 "name": "BaseBdev2", 00:14:12.298 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:12.298 "is_configured": true, 00:14:12.298 "data_offset": 0, 00:14:12.298 "data_size": 65536 00:14:12.298 }, 00:14:12.298 { 00:14:12.298 "name": "BaseBdev3", 00:14:12.298 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:12.298 "is_configured": true, 00:14:12.298 "data_offset": 0, 00:14:12.298 "data_size": 65536 00:14:12.298 } 00:14:12.298 ] 00:14:12.298 } 00:14:12.298 } 00:14:12.298 }' 00:14:12.298 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:12.298 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:14:12.298 BaseBdev2 00:14:12.298 BaseBdev3' 00:14:12.298 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.298 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:12.298 19:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.557 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.557 "name": "NewBaseBdev", 00:14:12.557 "aliases": [ 00:14:12.557 "6252b736-64bd-4327-b861-8191a6cf0e35" 00:14:12.557 ], 00:14:12.557 "product_name": "Malloc disk", 00:14:12.557 "block_size": 512, 00:14:12.557 "num_blocks": 65536, 00:14:12.557 "uuid": "6252b736-64bd-4327-b861-8191a6cf0e35", 00:14:12.557 "assigned_rate_limits": { 00:14:12.557 "rw_ios_per_sec": 0, 00:14:12.557 "rw_mbytes_per_sec": 0, 00:14:12.557 "r_mbytes_per_sec": 0, 00:14:12.557 "w_mbytes_per_sec": 0 00:14:12.557 }, 00:14:12.557 "claimed": true, 00:14:12.557 "claim_type": "exclusive_write", 00:14:12.557 "zoned": false, 00:14:12.557 "supported_io_types": { 00:14:12.557 "read": true, 00:14:12.557 "write": true, 00:14:12.557 "unmap": true, 00:14:12.557 "flush": true, 00:14:12.557 "reset": true, 00:14:12.557 "nvme_admin": false, 00:14:12.557 "nvme_io": false, 00:14:12.557 "nvme_io_md": false, 00:14:12.557 "write_zeroes": true, 00:14:12.557 "zcopy": true, 00:14:12.557 "get_zone_info": false, 00:14:12.557 "zone_management": false, 00:14:12.557 "zone_append": false, 00:14:12.557 "compare": false, 00:14:12.557 "compare_and_write": false, 00:14:12.557 "abort": true, 00:14:12.557 "seek_hole": false, 00:14:12.557 "seek_data": false, 00:14:12.557 "copy": true, 00:14:12.557 "nvme_iov_md": false 00:14:12.557 }, 00:14:12.557 "memory_domains": [ 00:14:12.557 { 00:14:12.557 "dma_device_id": "system", 00:14:12.557 
"dma_device_type": 1 00:14:12.557 }, 00:14:12.557 { 00:14:12.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.557 "dma_device_type": 2 00:14:12.557 } 00:14:12.557 ], 00:14:12.557 "driver_specific": {} 00:14:12.557 }' 00:14:12.557 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.557 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.557 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.557 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:12.824 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.159 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.159 "name": 
"BaseBdev2", 00:14:13.159 "aliases": [ 00:14:13.159 "18281f88-32de-412f-8c26-a50ebd24bac9" 00:14:13.159 ], 00:14:13.159 "product_name": "Malloc disk", 00:14:13.159 "block_size": 512, 00:14:13.159 "num_blocks": 65536, 00:14:13.159 "uuid": "18281f88-32de-412f-8c26-a50ebd24bac9", 00:14:13.159 "assigned_rate_limits": { 00:14:13.159 "rw_ios_per_sec": 0, 00:14:13.159 "rw_mbytes_per_sec": 0, 00:14:13.159 "r_mbytes_per_sec": 0, 00:14:13.159 "w_mbytes_per_sec": 0 00:14:13.159 }, 00:14:13.159 "claimed": true, 00:14:13.159 "claim_type": "exclusive_write", 00:14:13.159 "zoned": false, 00:14:13.159 "supported_io_types": { 00:14:13.159 "read": true, 00:14:13.159 "write": true, 00:14:13.159 "unmap": true, 00:14:13.159 "flush": true, 00:14:13.159 "reset": true, 00:14:13.159 "nvme_admin": false, 00:14:13.159 "nvme_io": false, 00:14:13.159 "nvme_io_md": false, 00:14:13.159 "write_zeroes": true, 00:14:13.159 "zcopy": true, 00:14:13.159 "get_zone_info": false, 00:14:13.159 "zone_management": false, 00:14:13.159 "zone_append": false, 00:14:13.159 "compare": false, 00:14:13.159 "compare_and_write": false, 00:14:13.159 "abort": true, 00:14:13.159 "seek_hole": false, 00:14:13.159 "seek_data": false, 00:14:13.159 "copy": true, 00:14:13.159 "nvme_iov_md": false 00:14:13.159 }, 00:14:13.159 "memory_domains": [ 00:14:13.159 { 00:14:13.159 "dma_device_id": "system", 00:14:13.159 "dma_device_type": 1 00:14:13.159 }, 00:14:13.159 { 00:14:13.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.159 "dma_device_type": 2 00:14:13.159 } 00:14:13.159 ], 00:14:13.159 "driver_specific": {} 00:14:13.159 }' 00:14:13.159 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.159 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.159 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.159 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:13.419 19:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.988 "name": "BaseBdev3", 00:14:13.988 "aliases": [ 00:14:13.988 "0d11cbcd-fd55-498b-85de-f183b5566ce9" 00:14:13.988 ], 00:14:13.988 "product_name": "Malloc disk", 00:14:13.988 "block_size": 512, 00:14:13.988 "num_blocks": 65536, 00:14:13.988 "uuid": "0d11cbcd-fd55-498b-85de-f183b5566ce9", 00:14:13.988 "assigned_rate_limits": { 00:14:13.988 "rw_ios_per_sec": 0, 00:14:13.988 "rw_mbytes_per_sec": 0, 00:14:13.988 "r_mbytes_per_sec": 0, 00:14:13.988 "w_mbytes_per_sec": 0 00:14:13.988 }, 00:14:13.988 "claimed": true, 00:14:13.988 "claim_type": "exclusive_write", 00:14:13.988 "zoned": false, 00:14:13.988 "supported_io_types": { 
00:14:13.988 "read": true, 00:14:13.988 "write": true, 00:14:13.988 "unmap": true, 00:14:13.988 "flush": true, 00:14:13.988 "reset": true, 00:14:13.988 "nvme_admin": false, 00:14:13.988 "nvme_io": false, 00:14:13.988 "nvme_io_md": false, 00:14:13.988 "write_zeroes": true, 00:14:13.988 "zcopy": true, 00:14:13.988 "get_zone_info": false, 00:14:13.988 "zone_management": false, 00:14:13.988 "zone_append": false, 00:14:13.988 "compare": false, 00:14:13.988 "compare_and_write": false, 00:14:13.988 "abort": true, 00:14:13.988 "seek_hole": false, 00:14:13.988 "seek_data": false, 00:14:13.988 "copy": true, 00:14:13.988 "nvme_iov_md": false 00:14:13.988 }, 00:14:13.988 "memory_domains": [ 00:14:13.988 { 00:14:13.988 "dma_device_id": "system", 00:14:13.988 "dma_device_type": 1 00:14:13.988 }, 00:14:13.988 { 00:14:13.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.988 "dma_device_type": 2 00:14:13.988 } 00:14:13.988 ], 00:14:13.988 "driver_specific": {} 00:14:13.988 }' 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.988 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:14.247 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.247 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.247 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:14.506 [2024-07-24 19:50:05.851954] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:14.506 [2024-07-24 19:50:05.851982] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:14.506 [2024-07-24 19:50:05.852036] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:14.506 [2024-07-24 19:50:05.852089] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:14.506 [2024-07-24 19:50:05.852100] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ef4b0 name Existed_Raid, state offline 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1394405 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1394405 ']' 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1394405 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1394405 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' 
reactor_0 = sudo ']' 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1394405' 00:14:14.506 killing process with pid 1394405 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1394405 00:14:14.506 [2024-07-24 19:50:05.921105] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:14.506 19:50:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1394405 00:14:14.506 [2024-07-24 19:50:05.951295] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:14.766 00:14:14.766 real 0m29.631s 00:14:14.766 user 0m54.475s 00:14:14.766 sys 0m5.259s 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.766 ************************************ 00:14:14.766 END TEST raid_state_function_test 00:14:14.766 ************************************ 00:14:14.766 19:50:06 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:14:14.766 19:50:06 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:14.766 19:50:06 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:14.766 19:50:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:14.766 ************************************ 00:14:14.766 START TEST raid_state_function_test_sb 00:14:14.766 ************************************ 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 true 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:14.766 19:50:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:14.766 19:50:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1398788 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1398788' 00:14:14.766 Process raid pid: 1398788 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1398788 /var/tmp/spdk-raid.sock 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1398788 ']' 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:14:14.766 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:14.766 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:14.766 [2024-07-24 19:50:06.326299] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:14:14.766 [2024-07-24 19:50:06.326363] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:15.026 [2024-07-24 19:50:06.447050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:15.026 [2024-07-24 19:50:06.555536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:15.026 [2024-07-24 19:50:06.617952] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:15.026 [2024-07-24 19:50:06.617982] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:15.284 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:15.284 19:50:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:14:15.284 19:50:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:15.543 [2024-07-24 19:50:07.019458] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:15.543 [2024-07-24 19:50:07.019494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:15.543 [2024-07-24 19:50:07.019505] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev2 00:14:15.543 [2024-07-24 19:50:07.019516] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:15.543 [2024-07-24 19:50:07.019525] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:15.543 [2024-07-24 19:50:07.019536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.543 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.802 19:50:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.802 "name": "Existed_Raid", 00:14:15.802 "uuid": "0606a76e-7b2c-4dea-beed-acb226cc70d5", 00:14:15.802 "strip_size_kb": 64, 00:14:15.802 "state": "configuring", 00:14:15.802 "raid_level": "raid0", 00:14:15.802 "superblock": true, 00:14:15.802 "num_base_bdevs": 3, 00:14:15.802 "num_base_bdevs_discovered": 0, 00:14:15.802 "num_base_bdevs_operational": 3, 00:14:15.802 "base_bdevs_list": [ 00:14:15.802 { 00:14:15.802 "name": "BaseBdev1", 00:14:15.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.802 "is_configured": false, 00:14:15.802 "data_offset": 0, 00:14:15.802 "data_size": 0 00:14:15.802 }, 00:14:15.802 { 00:14:15.802 "name": "BaseBdev2", 00:14:15.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.802 "is_configured": false, 00:14:15.802 "data_offset": 0, 00:14:15.802 "data_size": 0 00:14:15.802 }, 00:14:15.802 { 00:14:15.802 "name": "BaseBdev3", 00:14:15.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.802 "is_configured": false, 00:14:15.802 "data_offset": 0, 00:14:15.802 "data_size": 0 00:14:15.802 } 00:14:15.802 ] 00:14:15.802 }' 00:14:15.802 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.802 19:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:16.370 19:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:16.629 [2024-07-24 19:50:08.090154] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:16.629 [2024-07-24 19:50:08.090179] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13d3a10 name Existed_Raid, state configuring 00:14:16.629 19:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:16.889 [2024-07-24 19:50:08.338840] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:16.889 [2024-07-24 19:50:08.338870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:16.889 [2024-07-24 19:50:08.338885] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:16.889 [2024-07-24 19:50:08.338897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:16.889 [2024-07-24 19:50:08.338905] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:16.889 [2024-07-24 19:50:08.338916] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:16.889 19:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:17.148 [2024-07-24 19:50:08.597299] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:17.148 BaseBdev1 00:14:17.148 19:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:17.148 19:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:17.148 19:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:17.148 19:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:17.148 19:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:17.148 19:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:14:17.148 19:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:17.407 19:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:17.667 [ 00:14:17.667 { 00:14:17.667 "name": "BaseBdev1", 00:14:17.667 "aliases": [ 00:14:17.667 "121a6a85-794f-449c-9fc7-e90b4b18e41c" 00:14:17.667 ], 00:14:17.667 "product_name": "Malloc disk", 00:14:17.667 "block_size": 512, 00:14:17.667 "num_blocks": 65536, 00:14:17.667 "uuid": "121a6a85-794f-449c-9fc7-e90b4b18e41c", 00:14:17.667 "assigned_rate_limits": { 00:14:17.667 "rw_ios_per_sec": 0, 00:14:17.667 "rw_mbytes_per_sec": 0, 00:14:17.667 "r_mbytes_per_sec": 0, 00:14:17.667 "w_mbytes_per_sec": 0 00:14:17.667 }, 00:14:17.667 "claimed": true, 00:14:17.667 "claim_type": "exclusive_write", 00:14:17.667 "zoned": false, 00:14:17.667 "supported_io_types": { 00:14:17.667 "read": true, 00:14:17.667 "write": true, 00:14:17.667 "unmap": true, 00:14:17.667 "flush": true, 00:14:17.667 "reset": true, 00:14:17.667 "nvme_admin": false, 00:14:17.667 "nvme_io": false, 00:14:17.667 "nvme_io_md": false, 00:14:17.667 "write_zeroes": true, 00:14:17.667 "zcopy": true, 00:14:17.667 "get_zone_info": false, 00:14:17.667 "zone_management": false, 00:14:17.667 "zone_append": false, 00:14:17.667 "compare": false, 00:14:17.667 "compare_and_write": false, 00:14:17.667 "abort": true, 00:14:17.667 "seek_hole": false, 00:14:17.667 "seek_data": false, 00:14:17.667 "copy": true, 00:14:17.667 "nvme_iov_md": false 00:14:17.667 }, 00:14:17.667 "memory_domains": [ 00:14:17.667 { 00:14:17.667 "dma_device_id": "system", 00:14:17.667 "dma_device_type": 1 00:14:17.667 }, 00:14:17.667 { 00:14:17.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.667 
"dma_device_type": 2 00:14:17.667 } 00:14:17.667 ], 00:14:17.667 "driver_specific": {} 00:14:17.667 } 00:14:17.667 ] 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.667 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.927 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.927 "name": "Existed_Raid", 00:14:17.927 "uuid": "1737e4e7-cea6-4adc-97e4-35ef3ac87186", 00:14:17.927 "strip_size_kb": 64, 
00:14:17.927 "state": "configuring", 00:14:17.927 "raid_level": "raid0", 00:14:17.927 "superblock": true, 00:14:17.927 "num_base_bdevs": 3, 00:14:17.927 "num_base_bdevs_discovered": 1, 00:14:17.927 "num_base_bdevs_operational": 3, 00:14:17.927 "base_bdevs_list": [ 00:14:17.927 { 00:14:17.927 "name": "BaseBdev1", 00:14:17.927 "uuid": "121a6a85-794f-449c-9fc7-e90b4b18e41c", 00:14:17.927 "is_configured": true, 00:14:17.927 "data_offset": 2048, 00:14:17.927 "data_size": 63488 00:14:17.927 }, 00:14:17.927 { 00:14:17.927 "name": "BaseBdev2", 00:14:17.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.927 "is_configured": false, 00:14:17.927 "data_offset": 0, 00:14:17.927 "data_size": 0 00:14:17.927 }, 00:14:17.927 { 00:14:17.927 "name": "BaseBdev3", 00:14:17.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.927 "is_configured": false, 00:14:17.927 "data_offset": 0, 00:14:17.927 "data_size": 0 00:14:17.927 } 00:14:17.927 ] 00:14:17.927 }' 00:14:17.927 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.927 19:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:18.495 19:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:18.754 [2024-07-24 19:50:10.153411] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:18.754 [2024-07-24 19:50:10.153456] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13d32e0 name Existed_Raid, state configuring 00:14:18.754 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:19.013 [2024-07-24 19:50:10.402112] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:19.013 [2024-07-24 19:50:10.403603] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:19.013 [2024-07-24 19:50:10.403635] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:19.013 [2024-07-24 19:50:10.403646] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:19.013 [2024-07-24 19:50:10.403658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.013 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:19.272 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.272 "name": "Existed_Raid", 00:14:19.272 "uuid": "28737960-b788-4fa0-bbef-51e3f5e399fa", 00:14:19.272 "strip_size_kb": 64, 00:14:19.272 "state": "configuring", 00:14:19.272 "raid_level": "raid0", 00:14:19.272 "superblock": true, 00:14:19.272 "num_base_bdevs": 3, 00:14:19.272 "num_base_bdevs_discovered": 1, 00:14:19.272 "num_base_bdevs_operational": 3, 00:14:19.272 "base_bdevs_list": [ 00:14:19.272 { 00:14:19.272 "name": "BaseBdev1", 00:14:19.272 "uuid": "121a6a85-794f-449c-9fc7-e90b4b18e41c", 00:14:19.272 "is_configured": true, 00:14:19.272 "data_offset": 2048, 00:14:19.272 "data_size": 63488 00:14:19.272 }, 00:14:19.272 { 00:14:19.272 "name": "BaseBdev2", 00:14:19.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.272 "is_configured": false, 00:14:19.272 "data_offset": 0, 00:14:19.272 "data_size": 0 00:14:19.272 }, 00:14:19.272 { 00:14:19.272 "name": "BaseBdev3", 00:14:19.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.272 "is_configured": false, 00:14:19.272 "data_offset": 0, 00:14:19.272 "data_size": 0 00:14:19.272 } 00:14:19.272 ] 00:14:19.272 }' 00:14:19.272 19:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.272 19:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:19.839 
[2024-07-24 19:50:11.319904] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:19.839 BaseBdev2 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:19.839 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:20.098 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:20.357 [ 00:14:20.358 { 00:14:20.358 "name": "BaseBdev2", 00:14:20.358 "aliases": [ 00:14:20.358 "00e13fb4-994c-441c-9f29-8131609712c3" 00:14:20.358 ], 00:14:20.358 "product_name": "Malloc disk", 00:14:20.358 "block_size": 512, 00:14:20.358 "num_blocks": 65536, 00:14:20.358 "uuid": "00e13fb4-994c-441c-9f29-8131609712c3", 00:14:20.358 "assigned_rate_limits": { 00:14:20.358 "rw_ios_per_sec": 0, 00:14:20.358 "rw_mbytes_per_sec": 0, 00:14:20.358 "r_mbytes_per_sec": 0, 00:14:20.358 "w_mbytes_per_sec": 0 00:14:20.358 }, 00:14:20.358 "claimed": true, 00:14:20.358 "claim_type": "exclusive_write", 00:14:20.358 "zoned": false, 00:14:20.358 "supported_io_types": { 00:14:20.358 "read": true, 00:14:20.358 "write": true, 00:14:20.358 "unmap": 
true, 00:14:20.358 "flush": true, 00:14:20.358 "reset": true, 00:14:20.358 "nvme_admin": false, 00:14:20.358 "nvme_io": false, 00:14:20.358 "nvme_io_md": false, 00:14:20.358 "write_zeroes": true, 00:14:20.358 "zcopy": true, 00:14:20.358 "get_zone_info": false, 00:14:20.358 "zone_management": false, 00:14:20.358 "zone_append": false, 00:14:20.358 "compare": false, 00:14:20.358 "compare_and_write": false, 00:14:20.358 "abort": true, 00:14:20.358 "seek_hole": false, 00:14:20.358 "seek_data": false, 00:14:20.358 "copy": true, 00:14:20.358 "nvme_iov_md": false 00:14:20.358 }, 00:14:20.358 "memory_domains": [ 00:14:20.358 { 00:14:20.358 "dma_device_id": "system", 00:14:20.358 "dma_device_type": 1 00:14:20.358 }, 00:14:20.358 { 00:14:20.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.358 "dma_device_type": 2 00:14:20.358 } 00:14:20.358 ], 00:14:20.358 "driver_specific": {} 00:14:20.358 } 00:14:20.358 ] 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:20.358 
19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.358 19:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.617 19:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.617 "name": "Existed_Raid", 00:14:20.617 "uuid": "28737960-b788-4fa0-bbef-51e3f5e399fa", 00:14:20.617 "strip_size_kb": 64, 00:14:20.617 "state": "configuring", 00:14:20.617 "raid_level": "raid0", 00:14:20.617 "superblock": true, 00:14:20.617 "num_base_bdevs": 3, 00:14:20.617 "num_base_bdevs_discovered": 2, 00:14:20.617 "num_base_bdevs_operational": 3, 00:14:20.617 "base_bdevs_list": [ 00:14:20.617 { 00:14:20.617 "name": "BaseBdev1", 00:14:20.617 "uuid": "121a6a85-794f-449c-9fc7-e90b4b18e41c", 00:14:20.617 "is_configured": true, 00:14:20.617 "data_offset": 2048, 00:14:20.617 "data_size": 63488 00:14:20.617 }, 00:14:20.617 { 00:14:20.617 "name": "BaseBdev2", 00:14:20.617 "uuid": "00e13fb4-994c-441c-9f29-8131609712c3", 00:14:20.617 "is_configured": true, 00:14:20.617 "data_offset": 2048, 00:14:20.617 "data_size": 63488 00:14:20.617 }, 00:14:20.617 { 00:14:20.617 "name": "BaseBdev3", 00:14:20.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.617 "is_configured": false, 00:14:20.617 "data_offset": 0, 00:14:20.617 "data_size": 0 00:14:20.617 } 00:14:20.617 ] 00:14:20.617 }' 00:14:20.617 
19:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.617 19:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:21.184 19:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:21.443 [2024-07-24 19:50:12.851346] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:21.443 [2024-07-24 19:50:12.851513] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13d41d0 00:14:21.443 [2024-07-24 19:50:12.851528] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:21.443 [2024-07-24 19:50:12.851700] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13d4f30 00:14:21.443 [2024-07-24 19:50:12.851822] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13d41d0 00:14:21.443 [2024-07-24 19:50:12.851832] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13d41d0 00:14:21.443 [2024-07-24 19:50:12.851921] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:21.443 BaseBdev3 00:14:21.443 19:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:21.443 19:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:21.443 19:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:21.443 19:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:21.443 19:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:21.443 19:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:14:21.443 19:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:21.702 19:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:21.964 [ 00:14:21.964 { 00:14:21.964 "name": "BaseBdev3", 00:14:21.964 "aliases": [ 00:14:21.964 "19852cf7-0fe0-4f64-b0cf-4351f29eaadd" 00:14:21.964 ], 00:14:21.964 "product_name": "Malloc disk", 00:14:21.964 "block_size": 512, 00:14:21.964 "num_blocks": 65536, 00:14:21.964 "uuid": "19852cf7-0fe0-4f64-b0cf-4351f29eaadd", 00:14:21.964 "assigned_rate_limits": { 00:14:21.964 "rw_ios_per_sec": 0, 00:14:21.964 "rw_mbytes_per_sec": 0, 00:14:21.964 "r_mbytes_per_sec": 0, 00:14:21.964 "w_mbytes_per_sec": 0 00:14:21.964 }, 00:14:21.964 "claimed": true, 00:14:21.964 "claim_type": "exclusive_write", 00:14:21.964 "zoned": false, 00:14:21.964 "supported_io_types": { 00:14:21.964 "read": true, 00:14:21.964 "write": true, 00:14:21.964 "unmap": true, 00:14:21.964 "flush": true, 00:14:21.964 "reset": true, 00:14:21.964 "nvme_admin": false, 00:14:21.964 "nvme_io": false, 00:14:21.964 "nvme_io_md": false, 00:14:21.964 "write_zeroes": true, 00:14:21.964 "zcopy": true, 00:14:21.964 "get_zone_info": false, 00:14:21.964 "zone_management": false, 00:14:21.964 "zone_append": false, 00:14:21.964 "compare": false, 00:14:21.964 "compare_and_write": false, 00:14:21.964 "abort": true, 00:14:21.964 "seek_hole": false, 00:14:21.964 "seek_data": false, 00:14:21.964 "copy": true, 00:14:21.964 "nvme_iov_md": false 00:14:21.964 }, 00:14:21.964 "memory_domains": [ 00:14:21.964 { 00:14:21.964 "dma_device_id": "system", 00:14:21.964 "dma_device_type": 1 00:14:21.964 }, 00:14:21.964 { 00:14:21.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.964 
"dma_device_type": 2 00:14:21.964 } 00:14:21.964 ], 00:14:21.964 "driver_specific": {} 00:14:21.964 } 00:14:21.964 ] 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.964 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.965 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.965 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.965 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.965 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:22.223 19:50:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.223 "name": "Existed_Raid", 00:14:22.223 "uuid": "28737960-b788-4fa0-bbef-51e3f5e399fa", 00:14:22.223 "strip_size_kb": 64, 00:14:22.223 "state": "online", 00:14:22.223 "raid_level": "raid0", 00:14:22.223 "superblock": true, 00:14:22.223 "num_base_bdevs": 3, 00:14:22.223 "num_base_bdevs_discovered": 3, 00:14:22.223 "num_base_bdevs_operational": 3, 00:14:22.223 "base_bdevs_list": [ 00:14:22.223 { 00:14:22.223 "name": "BaseBdev1", 00:14:22.223 "uuid": "121a6a85-794f-449c-9fc7-e90b4b18e41c", 00:14:22.223 "is_configured": true, 00:14:22.223 "data_offset": 2048, 00:14:22.223 "data_size": 63488 00:14:22.223 }, 00:14:22.223 { 00:14:22.223 "name": "BaseBdev2", 00:14:22.223 "uuid": "00e13fb4-994c-441c-9f29-8131609712c3", 00:14:22.223 "is_configured": true, 00:14:22.223 "data_offset": 2048, 00:14:22.223 "data_size": 63488 00:14:22.223 }, 00:14:22.223 { 00:14:22.223 "name": "BaseBdev3", 00:14:22.223 "uuid": "19852cf7-0fe0-4f64-b0cf-4351f29eaadd", 00:14:22.223 "is_configured": true, 00:14:22.223 "data_offset": 2048, 00:14:22.223 "data_size": 63488 00:14:22.223 } 00:14:22.223 ] 00:14:22.223 }' 00:14:22.223 19:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.223 19:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:22.791 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:23.050 [2024-07-24 19:50:14.439861] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:23.050 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:23.050 "name": "Existed_Raid", 00:14:23.050 "aliases": [ 00:14:23.050 "28737960-b788-4fa0-bbef-51e3f5e399fa" 00:14:23.050 ], 00:14:23.050 "product_name": "Raid Volume", 00:14:23.050 "block_size": 512, 00:14:23.050 "num_blocks": 190464, 00:14:23.050 "uuid": "28737960-b788-4fa0-bbef-51e3f5e399fa", 00:14:23.050 "assigned_rate_limits": { 00:14:23.050 "rw_ios_per_sec": 0, 00:14:23.050 "rw_mbytes_per_sec": 0, 00:14:23.050 "r_mbytes_per_sec": 0, 00:14:23.050 "w_mbytes_per_sec": 0 00:14:23.050 }, 00:14:23.050 "claimed": false, 00:14:23.050 "zoned": false, 00:14:23.050 "supported_io_types": { 00:14:23.050 "read": true, 00:14:23.050 "write": true, 00:14:23.050 "unmap": true, 00:14:23.050 "flush": true, 00:14:23.050 "reset": true, 00:14:23.050 "nvme_admin": false, 00:14:23.050 "nvme_io": false, 00:14:23.050 "nvme_io_md": false, 00:14:23.050 "write_zeroes": true, 00:14:23.050 "zcopy": false, 00:14:23.050 "get_zone_info": false, 00:14:23.050 "zone_management": false, 00:14:23.050 "zone_append": false, 00:14:23.050 "compare": false, 00:14:23.050 "compare_and_write": false, 00:14:23.050 "abort": false, 00:14:23.050 "seek_hole": false, 00:14:23.050 "seek_data": false, 00:14:23.050 "copy": false, 00:14:23.050 "nvme_iov_md": false 00:14:23.050 }, 00:14:23.050 "memory_domains": [ 00:14:23.050 { 00:14:23.050 "dma_device_id": "system", 00:14:23.050 
"dma_device_type": 1 00:14:23.050 }, 00:14:23.050 { 00:14:23.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.050 "dma_device_type": 2 00:14:23.050 }, 00:14:23.050 { 00:14:23.050 "dma_device_id": "system", 00:14:23.050 "dma_device_type": 1 00:14:23.050 }, 00:14:23.050 { 00:14:23.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.050 "dma_device_type": 2 00:14:23.050 }, 00:14:23.050 { 00:14:23.050 "dma_device_id": "system", 00:14:23.050 "dma_device_type": 1 00:14:23.050 }, 00:14:23.050 { 00:14:23.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.050 "dma_device_type": 2 00:14:23.050 } 00:14:23.050 ], 00:14:23.050 "driver_specific": { 00:14:23.050 "raid": { 00:14:23.050 "uuid": "28737960-b788-4fa0-bbef-51e3f5e399fa", 00:14:23.050 "strip_size_kb": 64, 00:14:23.050 "state": "online", 00:14:23.050 "raid_level": "raid0", 00:14:23.050 "superblock": true, 00:14:23.050 "num_base_bdevs": 3, 00:14:23.050 "num_base_bdevs_discovered": 3, 00:14:23.050 "num_base_bdevs_operational": 3, 00:14:23.050 "base_bdevs_list": [ 00:14:23.050 { 00:14:23.050 "name": "BaseBdev1", 00:14:23.050 "uuid": "121a6a85-794f-449c-9fc7-e90b4b18e41c", 00:14:23.050 "is_configured": true, 00:14:23.050 "data_offset": 2048, 00:14:23.050 "data_size": 63488 00:14:23.050 }, 00:14:23.050 { 00:14:23.051 "name": "BaseBdev2", 00:14:23.051 "uuid": "00e13fb4-994c-441c-9f29-8131609712c3", 00:14:23.051 "is_configured": true, 00:14:23.051 "data_offset": 2048, 00:14:23.051 "data_size": 63488 00:14:23.051 }, 00:14:23.051 { 00:14:23.051 "name": "BaseBdev3", 00:14:23.051 "uuid": "19852cf7-0fe0-4f64-b0cf-4351f29eaadd", 00:14:23.051 "is_configured": true, 00:14:23.051 "data_offset": 2048, 00:14:23.051 "data_size": 63488 00:14:23.051 } 00:14:23.051 ] 00:14:23.051 } 00:14:23.051 } 00:14:23.051 }' 00:14:23.051 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:23.051 19:50:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:23.051 BaseBdev2 00:14:23.051 BaseBdev3' 00:14:23.051 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:23.051 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:23.051 19:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:23.618 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:23.618 "name": "BaseBdev1", 00:14:23.618 "aliases": [ 00:14:23.618 "121a6a85-794f-449c-9fc7-e90b4b18e41c" 00:14:23.618 ], 00:14:23.618 "product_name": "Malloc disk", 00:14:23.618 "block_size": 512, 00:14:23.618 "num_blocks": 65536, 00:14:23.618 "uuid": "121a6a85-794f-449c-9fc7-e90b4b18e41c", 00:14:23.618 "assigned_rate_limits": { 00:14:23.618 "rw_ios_per_sec": 0, 00:14:23.618 "rw_mbytes_per_sec": 0, 00:14:23.618 "r_mbytes_per_sec": 0, 00:14:23.618 "w_mbytes_per_sec": 0 00:14:23.618 }, 00:14:23.618 "claimed": true, 00:14:23.618 "claim_type": "exclusive_write", 00:14:23.618 "zoned": false, 00:14:23.618 "supported_io_types": { 00:14:23.618 "read": true, 00:14:23.618 "write": true, 00:14:23.618 "unmap": true, 00:14:23.618 "flush": true, 00:14:23.618 "reset": true, 00:14:23.618 "nvme_admin": false, 00:14:23.618 "nvme_io": false, 00:14:23.618 "nvme_io_md": false, 00:14:23.618 "write_zeroes": true, 00:14:23.618 "zcopy": true, 00:14:23.618 "get_zone_info": false, 00:14:23.618 "zone_management": false, 00:14:23.618 "zone_append": false, 00:14:23.618 "compare": false, 00:14:23.618 "compare_and_write": false, 00:14:23.618 "abort": true, 00:14:23.618 "seek_hole": false, 00:14:23.618 "seek_data": false, 00:14:23.618 "copy": true, 00:14:23.618 "nvme_iov_md": false 00:14:23.618 }, 00:14:23.618 "memory_domains": 
[ 00:14:23.618 { 00:14:23.618 "dma_device_id": "system", 00:14:23.618 "dma_device_type": 1 00:14:23.618 }, 00:14:23.618 { 00:14:23.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.618 "dma_device_type": 2 00:14:23.618 } 00:14:23.618 ], 00:14:23.618 "driver_specific": {} 00:14:23.618 }' 00:14:23.618 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:23.618 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:23.618 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:23.618 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:23.618 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:23.876 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:14:24.135 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.135 "name": "BaseBdev2", 00:14:24.135 "aliases": [ 00:14:24.135 "00e13fb4-994c-441c-9f29-8131609712c3" 00:14:24.135 ], 00:14:24.135 "product_name": "Malloc disk", 00:14:24.135 "block_size": 512, 00:14:24.135 "num_blocks": 65536, 00:14:24.135 "uuid": "00e13fb4-994c-441c-9f29-8131609712c3", 00:14:24.135 "assigned_rate_limits": { 00:14:24.135 "rw_ios_per_sec": 0, 00:14:24.135 "rw_mbytes_per_sec": 0, 00:14:24.135 "r_mbytes_per_sec": 0, 00:14:24.135 "w_mbytes_per_sec": 0 00:14:24.135 }, 00:14:24.135 "claimed": true, 00:14:24.135 "claim_type": "exclusive_write", 00:14:24.135 "zoned": false, 00:14:24.135 "supported_io_types": { 00:14:24.135 "read": true, 00:14:24.135 "write": true, 00:14:24.135 "unmap": true, 00:14:24.135 "flush": true, 00:14:24.135 "reset": true, 00:14:24.135 "nvme_admin": false, 00:14:24.135 "nvme_io": false, 00:14:24.135 "nvme_io_md": false, 00:14:24.135 "write_zeroes": true, 00:14:24.135 "zcopy": true, 00:14:24.135 "get_zone_info": false, 00:14:24.135 "zone_management": false, 00:14:24.135 "zone_append": false, 00:14:24.135 "compare": false, 00:14:24.135 "compare_and_write": false, 00:14:24.135 "abort": true, 00:14:24.135 "seek_hole": false, 00:14:24.135 "seek_data": false, 00:14:24.135 "copy": true, 00:14:24.135 "nvme_iov_md": false 00:14:24.135 }, 00:14:24.135 "memory_domains": [ 00:14:24.135 { 00:14:24.135 "dma_device_id": "system", 00:14:24.135 "dma_device_type": 1 00:14:24.135 }, 00:14:24.135 { 00:14:24.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.135 "dma_device_type": 2 00:14:24.135 } 00:14:24.135 ], 00:14:24.135 "driver_specific": {} 00:14:24.135 }' 00:14:24.135 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.135 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.394 19:50:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.394 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.395 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.395 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.395 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.395 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.395 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:24.395 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.395 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.653 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:24.653 19:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:24.653 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:24.653 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.912 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.913 "name": "BaseBdev3", 00:14:24.913 "aliases": [ 00:14:24.913 "19852cf7-0fe0-4f64-b0cf-4351f29eaadd" 00:14:24.913 ], 00:14:24.913 "product_name": "Malloc disk", 00:14:24.913 "block_size": 512, 00:14:24.913 "num_blocks": 65536, 00:14:24.913 "uuid": "19852cf7-0fe0-4f64-b0cf-4351f29eaadd", 00:14:24.913 "assigned_rate_limits": { 00:14:24.913 "rw_ios_per_sec": 0, 00:14:24.913 "rw_mbytes_per_sec": 0, 00:14:24.913 "r_mbytes_per_sec": 0, 00:14:24.913 
"w_mbytes_per_sec": 0 00:14:24.913 }, 00:14:24.913 "claimed": true, 00:14:24.913 "claim_type": "exclusive_write", 00:14:24.913 "zoned": false, 00:14:24.913 "supported_io_types": { 00:14:24.913 "read": true, 00:14:24.913 "write": true, 00:14:24.913 "unmap": true, 00:14:24.913 "flush": true, 00:14:24.913 "reset": true, 00:14:24.913 "nvme_admin": false, 00:14:24.913 "nvme_io": false, 00:14:24.913 "nvme_io_md": false, 00:14:24.913 "write_zeroes": true, 00:14:24.913 "zcopy": true, 00:14:24.913 "get_zone_info": false, 00:14:24.913 "zone_management": false, 00:14:24.913 "zone_append": false, 00:14:24.913 "compare": false, 00:14:24.913 "compare_and_write": false, 00:14:24.913 "abort": true, 00:14:24.913 "seek_hole": false, 00:14:24.913 "seek_data": false, 00:14:24.913 "copy": true, 00:14:24.913 "nvme_iov_md": false 00:14:24.913 }, 00:14:24.913 "memory_domains": [ 00:14:24.913 { 00:14:24.913 "dma_device_id": "system", 00:14:24.913 "dma_device_type": 1 00:14:24.913 }, 00:14:24.913 { 00:14:24.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.913 "dma_device_type": 2 00:14:24.913 } 00:14:24.913 ], 00:14:24.913 "driver_specific": {} 00:14:24.913 }' 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.913 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:25.172 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:25.172 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:25.172 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:25.172 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:25.172 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:25.431 [2024-07-24 19:50:16.821931] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:25.431 [2024-07-24 19:50:16.821955] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:25.431 [2024-07-24 19:50:16.821995] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.431 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.432 19:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.691 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.691 "name": "Existed_Raid", 00:14:25.691 "uuid": "28737960-b788-4fa0-bbef-51e3f5e399fa", 00:14:25.691 "strip_size_kb": 64, 00:14:25.691 "state": "offline", 00:14:25.691 "raid_level": "raid0", 00:14:25.691 "superblock": true, 00:14:25.691 "num_base_bdevs": 3, 00:14:25.691 "num_base_bdevs_discovered": 2, 00:14:25.691 "num_base_bdevs_operational": 2, 00:14:25.691 "base_bdevs_list": [ 00:14:25.691 { 00:14:25.691 "name": null, 00:14:25.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.691 "is_configured": false, 00:14:25.691 "data_offset": 2048, 00:14:25.691 "data_size": 63488 00:14:25.691 }, 00:14:25.691 { 00:14:25.691 "name": "BaseBdev2", 00:14:25.691 "uuid": "00e13fb4-994c-441c-9f29-8131609712c3", 00:14:25.691 "is_configured": true, 00:14:25.691 "data_offset": 2048, 00:14:25.691 "data_size": 
63488 00:14:25.691 }, 00:14:25.691 { 00:14:25.691 "name": "BaseBdev3", 00:14:25.691 "uuid": "19852cf7-0fe0-4f64-b0cf-4351f29eaadd", 00:14:25.691 "is_configured": true, 00:14:25.691 "data_offset": 2048, 00:14:25.691 "data_size": 63488 00:14:25.691 } 00:14:25.691 ] 00:14:25.691 }' 00:14:25.691 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.691 19:50:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:26.259 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:26.259 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:26.259 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.259 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:26.518 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:26.518 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:26.518 19:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:26.777 [2024-07-24 19:50:18.182570] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:26.777 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:26.777 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:26.777 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:26.777 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:27.037 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:27.037 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:27.037 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:27.296 [2024-07-24 19:50:18.686617] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:27.296 [2024-07-24 19:50:18.686662] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13d41d0 name Existed_Raid, state offline 00:14:27.296 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:27.296 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:27.296 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.296 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:27.555 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:27.555 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:27.555 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:27.555 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:27.555 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:27.555 19:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:27.814 BaseBdev2 00:14:27.814 19:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:27.814 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:27.814 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:27.814 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:27.814 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:27.814 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:27.814 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:28.074 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:28.348 [ 00:14:28.348 { 00:14:28.348 "name": "BaseBdev2", 00:14:28.348 "aliases": [ 00:14:28.348 "64e05946-ded0-4fca-9fd1-be9a6194a6b5" 00:14:28.348 ], 00:14:28.348 "product_name": "Malloc disk", 00:14:28.348 "block_size": 512, 00:14:28.348 "num_blocks": 65536, 00:14:28.348 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:28.348 "assigned_rate_limits": { 00:14:28.348 "rw_ios_per_sec": 0, 00:14:28.348 "rw_mbytes_per_sec": 0, 00:14:28.348 "r_mbytes_per_sec": 0, 00:14:28.348 "w_mbytes_per_sec": 0 00:14:28.348 }, 00:14:28.348 "claimed": false, 00:14:28.348 "zoned": false, 00:14:28.348 "supported_io_types": { 00:14:28.348 "read": true, 00:14:28.348 "write": true, 00:14:28.348 "unmap": true, 00:14:28.348 "flush": 
true, 00:14:28.348 "reset": true, 00:14:28.348 "nvme_admin": false, 00:14:28.348 "nvme_io": false, 00:14:28.348 "nvme_io_md": false, 00:14:28.348 "write_zeroes": true, 00:14:28.348 "zcopy": true, 00:14:28.348 "get_zone_info": false, 00:14:28.348 "zone_management": false, 00:14:28.348 "zone_append": false, 00:14:28.348 "compare": false, 00:14:28.348 "compare_and_write": false, 00:14:28.348 "abort": true, 00:14:28.348 "seek_hole": false, 00:14:28.348 "seek_data": false, 00:14:28.348 "copy": true, 00:14:28.348 "nvme_iov_md": false 00:14:28.348 }, 00:14:28.348 "memory_domains": [ 00:14:28.348 { 00:14:28.348 "dma_device_id": "system", 00:14:28.348 "dma_device_type": 1 00:14:28.348 }, 00:14:28.348 { 00:14:28.348 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.348 "dma_device_type": 2 00:14:28.348 } 00:14:28.348 ], 00:14:28.348 "driver_specific": {} 00:14:28.348 } 00:14:28.348 ] 00:14:28.348 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:28.348 19:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:28.348 19:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:28.348 19:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:28.666 BaseBdev3 00:14:28.666 19:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:28.666 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:28.666 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:28.666 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:28.666 19:50:19 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:28.666 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:28.666 19:50:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:28.666 19:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:28.925 [ 00:14:28.925 { 00:14:28.925 "name": "BaseBdev3", 00:14:28.925 "aliases": [ 00:14:28.925 "a065792c-e610-4513-a9cb-88dd27a6d345" 00:14:28.925 ], 00:14:28.925 "product_name": "Malloc disk", 00:14:28.925 "block_size": 512, 00:14:28.925 "num_blocks": 65536, 00:14:28.925 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:28.925 "assigned_rate_limits": { 00:14:28.925 "rw_ios_per_sec": 0, 00:14:28.925 "rw_mbytes_per_sec": 0, 00:14:28.925 "r_mbytes_per_sec": 0, 00:14:28.925 "w_mbytes_per_sec": 0 00:14:28.925 }, 00:14:28.925 "claimed": false, 00:14:28.925 "zoned": false, 00:14:28.925 "supported_io_types": { 00:14:28.925 "read": true, 00:14:28.925 "write": true, 00:14:28.925 "unmap": true, 00:14:28.925 "flush": true, 00:14:28.925 "reset": true, 00:14:28.925 "nvme_admin": false, 00:14:28.925 "nvme_io": false, 00:14:28.925 "nvme_io_md": false, 00:14:28.925 "write_zeroes": true, 00:14:28.925 "zcopy": true, 00:14:28.925 "get_zone_info": false, 00:14:28.925 "zone_management": false, 00:14:28.925 "zone_append": false, 00:14:28.925 "compare": false, 00:14:28.925 "compare_and_write": false, 00:14:28.925 "abort": true, 00:14:28.925 "seek_hole": false, 00:14:28.925 "seek_data": false, 00:14:28.925 "copy": true, 00:14:28.925 "nvme_iov_md": false 00:14:28.925 }, 00:14:28.925 "memory_domains": [ 00:14:28.925 { 00:14:28.926 "dma_device_id": "system", 00:14:28.926 "dma_device_type": 1 
00:14:28.926 }, 00:14:28.926 { 00:14:28.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.926 "dma_device_type": 2 00:14:28.926 } 00:14:28.926 ], 00:14:28.926 "driver_specific": {} 00:14:28.926 } 00:14:28.926 ] 00:14:28.926 19:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:28.926 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:28.926 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:28.926 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:29.185 [2024-07-24 19:50:20.664932] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:29.185 [2024-07-24 19:50:20.664975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:29.185 [2024-07-24 19:50:20.664994] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:29.185 [2024-07-24 19:50:20.666360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.185 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:29.444 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.444 "name": "Existed_Raid", 00:14:29.444 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:29.444 "strip_size_kb": 64, 00:14:29.444 "state": "configuring", 00:14:29.444 "raid_level": "raid0", 00:14:29.444 "superblock": true, 00:14:29.444 "num_base_bdevs": 3, 00:14:29.444 "num_base_bdevs_discovered": 2, 00:14:29.444 "num_base_bdevs_operational": 3, 00:14:29.444 "base_bdevs_list": [ 00:14:29.444 { 00:14:29.444 "name": "BaseBdev1", 00:14:29.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:29.444 "is_configured": false, 00:14:29.444 "data_offset": 0, 00:14:29.444 "data_size": 0 00:14:29.444 }, 00:14:29.444 { 00:14:29.444 "name": "BaseBdev2", 00:14:29.444 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:29.444 "is_configured": true, 00:14:29.444 "data_offset": 2048, 00:14:29.444 "data_size": 63488 00:14:29.444 }, 00:14:29.445 { 00:14:29.445 "name": "BaseBdev3", 00:14:29.445 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:29.445 "is_configured": true, 00:14:29.445 "data_offset": 2048, 00:14:29.445 
"data_size": 63488 00:14:29.445 } 00:14:29.445 ] 00:14:29.445 }' 00:14:29.445 19:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.445 19:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:30.013 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:30.273 [2024-07-24 19:50:21.731730] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:30.273 19:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:30.532 19:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.532 "name": "Existed_Raid", 00:14:30.532 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:30.532 "strip_size_kb": 64, 00:14:30.532 "state": "configuring", 00:14:30.532 "raid_level": "raid0", 00:14:30.532 "superblock": true, 00:14:30.532 "num_base_bdevs": 3, 00:14:30.532 "num_base_bdevs_discovered": 1, 00:14:30.532 "num_base_bdevs_operational": 3, 00:14:30.532 "base_bdevs_list": [ 00:14:30.532 { 00:14:30.532 "name": "BaseBdev1", 00:14:30.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.532 "is_configured": false, 00:14:30.532 "data_offset": 0, 00:14:30.532 "data_size": 0 00:14:30.532 }, 00:14:30.532 { 00:14:30.532 "name": null, 00:14:30.532 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:30.532 "is_configured": false, 00:14:30.532 "data_offset": 2048, 00:14:30.532 "data_size": 63488 00:14:30.532 }, 00:14:30.532 { 00:14:30.532 "name": "BaseBdev3", 00:14:30.532 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:30.532 "is_configured": true, 00:14:30.532 "data_offset": 2048, 00:14:30.532 "data_size": 63488 00:14:30.532 } 00:14:30.532 ] 00:14:30.532 }' 00:14:30.532 19:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.532 19:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.102 19:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.102 19:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:31.361 19:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:14:31.361 19:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:31.621 [2024-07-24 19:50:23.126926] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:31.621 BaseBdev1 00:14:31.621 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:31.621 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:31.621 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:31.621 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:31.621 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:31.621 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:31.621 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.882 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:32.143 [ 00:14:32.143 { 00:14:32.143 "name": "BaseBdev1", 00:14:32.143 "aliases": [ 00:14:32.143 "48c5472f-fda9-43c9-a2eb-53340f184e3f" 00:14:32.143 ], 00:14:32.143 "product_name": "Malloc disk", 00:14:32.143 "block_size": 512, 00:14:32.143 "num_blocks": 65536, 00:14:32.143 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:32.143 "assigned_rate_limits": { 00:14:32.143 "rw_ios_per_sec": 0, 00:14:32.143 "rw_mbytes_per_sec": 0, 00:14:32.143 "r_mbytes_per_sec": 0, 00:14:32.143 
"w_mbytes_per_sec": 0 00:14:32.143 }, 00:14:32.143 "claimed": true, 00:14:32.143 "claim_type": "exclusive_write", 00:14:32.143 "zoned": false, 00:14:32.143 "supported_io_types": { 00:14:32.143 "read": true, 00:14:32.143 "write": true, 00:14:32.143 "unmap": true, 00:14:32.143 "flush": true, 00:14:32.143 "reset": true, 00:14:32.143 "nvme_admin": false, 00:14:32.143 "nvme_io": false, 00:14:32.143 "nvme_io_md": false, 00:14:32.143 "write_zeroes": true, 00:14:32.143 "zcopy": true, 00:14:32.143 "get_zone_info": false, 00:14:32.143 "zone_management": false, 00:14:32.143 "zone_append": false, 00:14:32.143 "compare": false, 00:14:32.143 "compare_and_write": false, 00:14:32.143 "abort": true, 00:14:32.143 "seek_hole": false, 00:14:32.143 "seek_data": false, 00:14:32.143 "copy": true, 00:14:32.143 "nvme_iov_md": false 00:14:32.143 }, 00:14:32.143 "memory_domains": [ 00:14:32.143 { 00:14:32.143 "dma_device_id": "system", 00:14:32.143 "dma_device_type": 1 00:14:32.143 }, 00:14:32.143 { 00:14:32.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.143 "dma_device_type": 2 00:14:32.143 } 00:14:32.143 ], 00:14:32.143 "driver_specific": {} 00:14:32.143 } 00:14:32.143 ] 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.143 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:32.402 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.402 "name": "Existed_Raid", 00:14:32.402 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:32.402 "strip_size_kb": 64, 00:14:32.402 "state": "configuring", 00:14:32.402 "raid_level": "raid0", 00:14:32.402 "superblock": true, 00:14:32.402 "num_base_bdevs": 3, 00:14:32.402 "num_base_bdevs_discovered": 2, 00:14:32.402 "num_base_bdevs_operational": 3, 00:14:32.402 "base_bdevs_list": [ 00:14:32.402 { 00:14:32.402 "name": "BaseBdev1", 00:14:32.402 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:32.402 "is_configured": true, 00:14:32.402 "data_offset": 2048, 00:14:32.402 "data_size": 63488 00:14:32.402 }, 00:14:32.402 { 00:14:32.402 "name": null, 00:14:32.402 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:32.402 "is_configured": false, 00:14:32.402 "data_offset": 2048, 00:14:32.402 "data_size": 63488 00:14:32.402 }, 00:14:32.402 { 00:14:32.402 "name": "BaseBdev3", 00:14:32.402 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:32.402 "is_configured": true, 00:14:32.402 "data_offset": 2048, 00:14:32.402 "data_size": 63488 00:14:32.402 } 
00:14:32.402 ] 00:14:32.402 }' 00:14:32.402 19:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.402 19:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:32.970 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.970 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:33.229 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:33.229 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:33.488 [2024-07-24 19:50:24.947795] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.488 
19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.488 19:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:33.748 19:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.748 "name": "Existed_Raid", 00:14:33.748 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:33.748 "strip_size_kb": 64, 00:14:33.748 "state": "configuring", 00:14:33.748 "raid_level": "raid0", 00:14:33.748 "superblock": true, 00:14:33.748 "num_base_bdevs": 3, 00:14:33.748 "num_base_bdevs_discovered": 1, 00:14:33.748 "num_base_bdevs_operational": 3, 00:14:33.748 "base_bdevs_list": [ 00:14:33.748 { 00:14:33.748 "name": "BaseBdev1", 00:14:33.748 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:33.748 "is_configured": true, 00:14:33.748 "data_offset": 2048, 00:14:33.748 "data_size": 63488 00:14:33.748 }, 00:14:33.748 { 00:14:33.748 "name": null, 00:14:33.748 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:33.748 "is_configured": false, 00:14:33.748 "data_offset": 2048, 00:14:33.748 "data_size": 63488 00:14:33.748 }, 00:14:33.748 { 00:14:33.748 "name": null, 00:14:33.748 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:33.748 "is_configured": false, 00:14:33.748 "data_offset": 2048, 00:14:33.748 "data_size": 63488 00:14:33.748 } 00:14:33.748 ] 00:14:33.748 }' 00:14:33.748 19:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.748 19:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:34.317 19:50:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.317 19:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:34.576 19:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:34.576 19:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:34.836 [2024-07-24 19:50:26.199130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.836 19:50:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.836 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.095 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.095 "name": "Existed_Raid", 00:14:35.095 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:35.095 "strip_size_kb": 64, 00:14:35.095 "state": "configuring", 00:14:35.095 "raid_level": "raid0", 00:14:35.095 "superblock": true, 00:14:35.095 "num_base_bdevs": 3, 00:14:35.095 "num_base_bdevs_discovered": 2, 00:14:35.095 "num_base_bdevs_operational": 3, 00:14:35.095 "base_bdevs_list": [ 00:14:35.095 { 00:14:35.095 "name": "BaseBdev1", 00:14:35.095 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:35.095 "is_configured": true, 00:14:35.095 "data_offset": 2048, 00:14:35.095 "data_size": 63488 00:14:35.095 }, 00:14:35.095 { 00:14:35.095 "name": null, 00:14:35.095 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:35.095 "is_configured": false, 00:14:35.095 "data_offset": 2048, 00:14:35.095 "data_size": 63488 00:14:35.095 }, 00:14:35.095 { 00:14:35.095 "name": "BaseBdev3", 00:14:35.095 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:35.095 "is_configured": true, 00:14:35.095 "data_offset": 2048, 00:14:35.095 "data_size": 63488 00:14:35.095 } 00:14:35.095 ] 00:14:35.095 }' 00:14:35.095 19:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.095 19:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:35.663 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.664 19:50:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:35.664 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:35.664 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:35.923 [2024-07-24 19:50:27.474526] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.923 19:50:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.182 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.182 "name": "Existed_Raid", 00:14:36.182 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:36.182 "strip_size_kb": 64, 00:14:36.182 "state": "configuring", 00:14:36.182 "raid_level": "raid0", 00:14:36.182 "superblock": true, 00:14:36.182 "num_base_bdevs": 3, 00:14:36.182 "num_base_bdevs_discovered": 1, 00:14:36.182 "num_base_bdevs_operational": 3, 00:14:36.182 "base_bdevs_list": [ 00:14:36.182 { 00:14:36.182 "name": null, 00:14:36.182 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:36.182 "is_configured": false, 00:14:36.182 "data_offset": 2048, 00:14:36.182 "data_size": 63488 00:14:36.182 }, 00:14:36.182 { 00:14:36.182 "name": null, 00:14:36.182 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:36.182 "is_configured": false, 00:14:36.182 "data_offset": 2048, 00:14:36.182 "data_size": 63488 00:14:36.182 }, 00:14:36.182 { 00:14:36.182 "name": "BaseBdev3", 00:14:36.182 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:36.182 "is_configured": true, 00:14:36.182 "data_offset": 2048, 00:14:36.182 "data_size": 63488 00:14:36.182 } 00:14:36.182 ] 00:14:36.182 }' 00:14:36.182 19:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.182 19:50:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:37.120 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.121 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:37.121 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:37.121 19:50:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:37.380 [2024-07-24 19:50:28.870499] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.380 19:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.639 19:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.639 "name": 
"Existed_Raid", 00:14:37.639 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:37.639 "strip_size_kb": 64, 00:14:37.639 "state": "configuring", 00:14:37.639 "raid_level": "raid0", 00:14:37.639 "superblock": true, 00:14:37.639 "num_base_bdevs": 3, 00:14:37.639 "num_base_bdevs_discovered": 2, 00:14:37.639 "num_base_bdevs_operational": 3, 00:14:37.639 "base_bdevs_list": [ 00:14:37.639 { 00:14:37.639 "name": null, 00:14:37.639 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:37.639 "is_configured": false, 00:14:37.639 "data_offset": 2048, 00:14:37.639 "data_size": 63488 00:14:37.639 }, 00:14:37.639 { 00:14:37.640 "name": "BaseBdev2", 00:14:37.640 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:37.640 "is_configured": true, 00:14:37.640 "data_offset": 2048, 00:14:37.640 "data_size": 63488 00:14:37.640 }, 00:14:37.640 { 00:14:37.640 "name": "BaseBdev3", 00:14:37.640 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:37.640 "is_configured": true, 00:14:37.640 "data_offset": 2048, 00:14:37.640 "data_size": 63488 00:14:37.640 } 00:14:37.640 ] 00:14:37.640 }' 00:14:37.640 19:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.640 19:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:38.577 19:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.577 19:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:38.577 19:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:38.577 19:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.577 19:50:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:39.146 19:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 48c5472f-fda9-43c9-a2eb-53340f184e3f 00:14:39.405 [2024-07-24 19:50:30.852246] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:39.405 [2024-07-24 19:50:30.852409] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13d4ac0 00:14:39.405 [2024-07-24 19:50:30.852423] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:39.405 [2024-07-24 19:50:30.852593] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13d39e0 00:14:39.405 [2024-07-24 19:50:30.852707] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13d4ac0 00:14:39.405 [2024-07-24 19:50:30.852717] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13d4ac0 00:14:39.405 [2024-07-24 19:50:30.852807] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:39.405 NewBaseBdev 00:14:39.405 19:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:39.405 19:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:14:39.405 19:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:39.405 19:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:39.405 19:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:39.405 19:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:39.405 19:50:30 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.664 19:50:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:39.924 [ 00:14:39.924 { 00:14:39.924 "name": "NewBaseBdev", 00:14:39.924 "aliases": [ 00:14:39.924 "48c5472f-fda9-43c9-a2eb-53340f184e3f" 00:14:39.924 ], 00:14:39.924 "product_name": "Malloc disk", 00:14:39.924 "block_size": 512, 00:14:39.924 "num_blocks": 65536, 00:14:39.924 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:39.924 "assigned_rate_limits": { 00:14:39.924 "rw_ios_per_sec": 0, 00:14:39.924 "rw_mbytes_per_sec": 0, 00:14:39.924 "r_mbytes_per_sec": 0, 00:14:39.924 "w_mbytes_per_sec": 0 00:14:39.924 }, 00:14:39.924 "claimed": true, 00:14:39.924 "claim_type": "exclusive_write", 00:14:39.924 "zoned": false, 00:14:39.924 "supported_io_types": { 00:14:39.924 "read": true, 00:14:39.924 "write": true, 00:14:39.924 "unmap": true, 00:14:39.924 "flush": true, 00:14:39.924 "reset": true, 00:14:39.924 "nvme_admin": false, 00:14:39.924 "nvme_io": false, 00:14:39.924 "nvme_io_md": false, 00:14:39.924 "write_zeroes": true, 00:14:39.924 "zcopy": true, 00:14:39.924 "get_zone_info": false, 00:14:39.924 "zone_management": false, 00:14:39.924 "zone_append": false, 00:14:39.924 "compare": false, 00:14:39.924 "compare_and_write": false, 00:14:39.924 "abort": true, 00:14:39.924 "seek_hole": false, 00:14:39.924 "seek_data": false, 00:14:39.924 "copy": true, 00:14:39.924 "nvme_iov_md": false 00:14:39.924 }, 00:14:39.924 "memory_domains": [ 00:14:39.924 { 00:14:39.924 "dma_device_id": "system", 00:14:39.924 "dma_device_type": 1 00:14:39.924 }, 00:14:39.924 { 00:14:39.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.924 "dma_device_type": 2 00:14:39.924 } 
00:14:39.924 ], 00:14:39.924 "driver_specific": {} 00:14:39.924 } 00:14:39.924 ] 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.924 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.183 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.183 "name": "Existed_Raid", 00:14:40.183 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:40.183 "strip_size_kb": 64, 00:14:40.183 "state": "online", 00:14:40.183 
"raid_level": "raid0", 00:14:40.183 "superblock": true, 00:14:40.183 "num_base_bdevs": 3, 00:14:40.183 "num_base_bdevs_discovered": 3, 00:14:40.183 "num_base_bdevs_operational": 3, 00:14:40.183 "base_bdevs_list": [ 00:14:40.183 { 00:14:40.183 "name": "NewBaseBdev", 00:14:40.183 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:40.183 "is_configured": true, 00:14:40.183 "data_offset": 2048, 00:14:40.183 "data_size": 63488 00:14:40.183 }, 00:14:40.183 { 00:14:40.183 "name": "BaseBdev2", 00:14:40.183 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:40.183 "is_configured": true, 00:14:40.183 "data_offset": 2048, 00:14:40.183 "data_size": 63488 00:14:40.183 }, 00:14:40.183 { 00:14:40.183 "name": "BaseBdev3", 00:14:40.183 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:40.183 "is_configured": true, 00:14:40.183 "data_offset": 2048, 00:14:40.183 "data_size": 63488 00:14:40.183 } 00:14:40.183 ] 00:14:40.183 }' 00:14:40.183 19:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.183 19:50:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:40.752 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:41.012 [2024-07-24 19:50:32.420701] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:41.012 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:41.012 "name": "Existed_Raid", 00:14:41.012 "aliases": [ 00:14:41.012 "98ed0e39-4270-4955-83f3-11b99695870f" 00:14:41.012 ], 00:14:41.012 "product_name": "Raid Volume", 00:14:41.012 "block_size": 512, 00:14:41.012 "num_blocks": 190464, 00:14:41.012 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:41.012 "assigned_rate_limits": { 00:14:41.012 "rw_ios_per_sec": 0, 00:14:41.012 "rw_mbytes_per_sec": 0, 00:14:41.012 "r_mbytes_per_sec": 0, 00:14:41.012 "w_mbytes_per_sec": 0 00:14:41.012 }, 00:14:41.012 "claimed": false, 00:14:41.012 "zoned": false, 00:14:41.012 "supported_io_types": { 00:14:41.012 "read": true, 00:14:41.012 "write": true, 00:14:41.012 "unmap": true, 00:14:41.012 "flush": true, 00:14:41.012 "reset": true, 00:14:41.012 "nvme_admin": false, 00:14:41.012 "nvme_io": false, 00:14:41.012 "nvme_io_md": false, 00:14:41.012 "write_zeroes": true, 00:14:41.012 "zcopy": false, 00:14:41.012 "get_zone_info": false, 00:14:41.012 "zone_management": false, 00:14:41.012 "zone_append": false, 00:14:41.012 "compare": false, 00:14:41.012 "compare_and_write": false, 00:14:41.012 "abort": false, 00:14:41.012 "seek_hole": false, 00:14:41.012 "seek_data": false, 00:14:41.012 "copy": false, 00:14:41.012 "nvme_iov_md": false 00:14:41.012 }, 00:14:41.012 "memory_domains": [ 00:14:41.012 { 00:14:41.012 "dma_device_id": "system", 00:14:41.012 "dma_device_type": 1 00:14:41.012 }, 00:14:41.012 { 00:14:41.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.012 "dma_device_type": 2 00:14:41.012 }, 00:14:41.012 { 00:14:41.012 "dma_device_id": "system", 00:14:41.012 "dma_device_type": 1 00:14:41.012 
}, 00:14:41.012 { 00:14:41.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.012 "dma_device_type": 2 00:14:41.012 }, 00:14:41.012 { 00:14:41.012 "dma_device_id": "system", 00:14:41.012 "dma_device_type": 1 00:14:41.012 }, 00:14:41.012 { 00:14:41.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.012 "dma_device_type": 2 00:14:41.012 } 00:14:41.012 ], 00:14:41.012 "driver_specific": { 00:14:41.012 "raid": { 00:14:41.012 "uuid": "98ed0e39-4270-4955-83f3-11b99695870f", 00:14:41.012 "strip_size_kb": 64, 00:14:41.012 "state": "online", 00:14:41.012 "raid_level": "raid0", 00:14:41.012 "superblock": true, 00:14:41.012 "num_base_bdevs": 3, 00:14:41.012 "num_base_bdevs_discovered": 3, 00:14:41.012 "num_base_bdevs_operational": 3, 00:14:41.012 "base_bdevs_list": [ 00:14:41.012 { 00:14:41.012 "name": "NewBaseBdev", 00:14:41.012 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:41.012 "is_configured": true, 00:14:41.012 "data_offset": 2048, 00:14:41.012 "data_size": 63488 00:14:41.012 }, 00:14:41.012 { 00:14:41.012 "name": "BaseBdev2", 00:14:41.012 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:41.012 "is_configured": true, 00:14:41.012 "data_offset": 2048, 00:14:41.012 "data_size": 63488 00:14:41.012 }, 00:14:41.012 { 00:14:41.012 "name": "BaseBdev3", 00:14:41.012 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:41.012 "is_configured": true, 00:14:41.012 "data_offset": 2048, 00:14:41.012 "data_size": 63488 00:14:41.012 } 00:14:41.012 ] 00:14:41.012 } 00:14:41.012 } 00:14:41.012 }' 00:14:41.012 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:41.012 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:41.012 BaseBdev2 00:14:41.012 BaseBdev3' 00:14:41.012 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.012 
19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:41.012 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.271 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.271 "name": "NewBaseBdev", 00:14:41.271 "aliases": [ 00:14:41.271 "48c5472f-fda9-43c9-a2eb-53340f184e3f" 00:14:41.271 ], 00:14:41.271 "product_name": "Malloc disk", 00:14:41.271 "block_size": 512, 00:14:41.271 "num_blocks": 65536, 00:14:41.271 "uuid": "48c5472f-fda9-43c9-a2eb-53340f184e3f", 00:14:41.271 "assigned_rate_limits": { 00:14:41.271 "rw_ios_per_sec": 0, 00:14:41.271 "rw_mbytes_per_sec": 0, 00:14:41.271 "r_mbytes_per_sec": 0, 00:14:41.271 "w_mbytes_per_sec": 0 00:14:41.271 }, 00:14:41.271 "claimed": true, 00:14:41.271 "claim_type": "exclusive_write", 00:14:41.272 "zoned": false, 00:14:41.272 "supported_io_types": { 00:14:41.272 "read": true, 00:14:41.272 "write": true, 00:14:41.272 "unmap": true, 00:14:41.272 "flush": true, 00:14:41.272 "reset": true, 00:14:41.272 "nvme_admin": false, 00:14:41.272 "nvme_io": false, 00:14:41.272 "nvme_io_md": false, 00:14:41.272 "write_zeroes": true, 00:14:41.272 "zcopy": true, 00:14:41.272 "get_zone_info": false, 00:14:41.272 "zone_management": false, 00:14:41.272 "zone_append": false, 00:14:41.272 "compare": false, 00:14:41.272 "compare_and_write": false, 00:14:41.272 "abort": true, 00:14:41.272 "seek_hole": false, 00:14:41.272 "seek_data": false, 00:14:41.272 "copy": true, 00:14:41.272 "nvme_iov_md": false 00:14:41.272 }, 00:14:41.272 "memory_domains": [ 00:14:41.272 { 00:14:41.272 "dma_device_id": "system", 00:14:41.272 "dma_device_type": 1 00:14:41.272 }, 00:14:41.272 { 00:14:41.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.272 "dma_device_type": 2 00:14:41.272 } 00:14:41.272 ], 00:14:41.272 
"driver_specific": {} 00:14:41.272 }' 00:14:41.272 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.272 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.272 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.272 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.530 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.530 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.530 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.530 19:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.530 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.530 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.530 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.530 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.530 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.530 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.530 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:41.789 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.789 "name": "BaseBdev2", 00:14:41.789 "aliases": [ 00:14:41.789 "64e05946-ded0-4fca-9fd1-be9a6194a6b5" 00:14:41.789 ], 00:14:41.789 "product_name": 
"Malloc disk", 00:14:41.789 "block_size": 512, 00:14:41.789 "num_blocks": 65536, 00:14:41.789 "uuid": "64e05946-ded0-4fca-9fd1-be9a6194a6b5", 00:14:41.789 "assigned_rate_limits": { 00:14:41.789 "rw_ios_per_sec": 0, 00:14:41.789 "rw_mbytes_per_sec": 0, 00:14:41.789 "r_mbytes_per_sec": 0, 00:14:41.789 "w_mbytes_per_sec": 0 00:14:41.789 }, 00:14:41.789 "claimed": true, 00:14:41.789 "claim_type": "exclusive_write", 00:14:41.789 "zoned": false, 00:14:41.789 "supported_io_types": { 00:14:41.789 "read": true, 00:14:41.789 "write": true, 00:14:41.789 "unmap": true, 00:14:41.789 "flush": true, 00:14:41.789 "reset": true, 00:14:41.789 "nvme_admin": false, 00:14:41.789 "nvme_io": false, 00:14:41.789 "nvme_io_md": false, 00:14:41.789 "write_zeroes": true, 00:14:41.789 "zcopy": true, 00:14:41.789 "get_zone_info": false, 00:14:41.789 "zone_management": false, 00:14:41.789 "zone_append": false, 00:14:41.789 "compare": false, 00:14:41.789 "compare_and_write": false, 00:14:41.789 "abort": true, 00:14:41.789 "seek_hole": false, 00:14:41.789 "seek_data": false, 00:14:41.789 "copy": true, 00:14:41.789 "nvme_iov_md": false 00:14:41.789 }, 00:14:41.789 "memory_domains": [ 00:14:41.789 { 00:14:41.789 "dma_device_id": "system", 00:14:41.789 "dma_device_type": 1 00:14:41.789 }, 00:14:41.789 { 00:14:41.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.789 "dma_device_type": 2 00:14:41.789 } 00:14:41.789 ], 00:14:41.789 "driver_specific": {} 00:14:41.789 }' 00:14:41.789 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.048 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.048 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.048 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.048 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.048 
19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.048 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.048 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.307 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.307 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.307 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.307 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.307 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:42.307 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:42.307 19:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:42.566 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.566 "name": "BaseBdev3", 00:14:42.566 "aliases": [ 00:14:42.566 "a065792c-e610-4513-a9cb-88dd27a6d345" 00:14:42.566 ], 00:14:42.566 "product_name": "Malloc disk", 00:14:42.566 "block_size": 512, 00:14:42.566 "num_blocks": 65536, 00:14:42.566 "uuid": "a065792c-e610-4513-a9cb-88dd27a6d345", 00:14:42.566 "assigned_rate_limits": { 00:14:42.566 "rw_ios_per_sec": 0, 00:14:42.566 "rw_mbytes_per_sec": 0, 00:14:42.566 "r_mbytes_per_sec": 0, 00:14:42.566 "w_mbytes_per_sec": 0 00:14:42.566 }, 00:14:42.566 "claimed": true, 00:14:42.566 "claim_type": "exclusive_write", 00:14:42.566 "zoned": false, 00:14:42.566 "supported_io_types": { 00:14:42.566 "read": true, 00:14:42.566 "write": true, 00:14:42.566 "unmap": true, 
00:14:42.566 "flush": true, 00:14:42.566 "reset": true, 00:14:42.566 "nvme_admin": false, 00:14:42.566 "nvme_io": false, 00:14:42.566 "nvme_io_md": false, 00:14:42.566 "write_zeroes": true, 00:14:42.566 "zcopy": true, 00:14:42.566 "get_zone_info": false, 00:14:42.566 "zone_management": false, 00:14:42.566 "zone_append": false, 00:14:42.566 "compare": false, 00:14:42.566 "compare_and_write": false, 00:14:42.566 "abort": true, 00:14:42.566 "seek_hole": false, 00:14:42.566 "seek_data": false, 00:14:42.566 "copy": true, 00:14:42.566 "nvme_iov_md": false 00:14:42.566 }, 00:14:42.566 "memory_domains": [ 00:14:42.566 { 00:14:42.566 "dma_device_id": "system", 00:14:42.566 "dma_device_type": 1 00:14:42.566 }, 00:14:42.566 { 00:14:42.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.566 "dma_device_type": 2 00:14:42.566 } 00:14:42.566 ], 00:14:42.566 "driver_specific": {} 00:14:42.566 }' 00:14:42.566 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.566 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.566 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.566 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.825 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.825 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.826 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.826 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.826 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.826 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.826 19:50:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.826 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.826 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:43.085 [2024-07-24 19:50:34.610209] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:43.085 [2024-07-24 19:50:34.610233] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:43.085 [2024-07-24 19:50:34.610283] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:43.085 [2024-07-24 19:50:34.610333] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:43.085 [2024-07-24 19:50:34.610345] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13d4ac0 name Existed_Raid, state offline 00:14:43.085 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1398788 00:14:43.085 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1398788 ']' 00:14:43.085 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1398788 00:14:43.085 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:14:43.085 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:43.085 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1398788 00:14:43.344 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:43.344 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' 
reactor_0 = sudo ']' 00:14:43.344 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1398788' 00:14:43.344 killing process with pid 1398788 00:14:43.344 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1398788 00:14:43.344 [2024-07-24 19:50:34.696143] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:43.344 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1398788 00:14:43.344 [2024-07-24 19:50:34.723962] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:43.603 19:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:43.603 00:14:43.603 real 0m28.688s 00:14:43.603 user 0m53.131s 00:14:43.603 sys 0m5.135s 00:14:43.603 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:43.603 19:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:43.603 ************************************ 00:14:43.603 END TEST raid_state_function_test_sb 00:14:43.603 ************************************ 00:14:43.603 19:50:34 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:43.603 19:50:34 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:14:43.603 19:50:34 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:43.603 19:50:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:43.603 ************************************ 00:14:43.603 START TEST raid_superblock_test 00:14:43.603 ************************************ 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 3 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1403179 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1403179 /var/tmp/spdk-raid.sock 00:14:43.603 19:50:35 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1403179 ']' 00:14:43.603 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:43.604 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:43.604 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:43.604 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:43.604 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:43.604 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.604 [2024-07-24 19:50:35.086606] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:14:43.604 [2024-07-24 19:50:35.086677] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1403179 ] 00:14:43.862 [2024-07-24 19:50:35.219469] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:43.862 [2024-07-24 19:50:35.321566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:43.862 [2024-07-24 19:50:35.387253] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:43.862 [2024-07-24 19:50:35.387293] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:14:44.429 
19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:44.429 19:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:44.687 malloc1 00:14:44.687 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:44.946 [2024-07-24 19:50:36.457663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:44.946 [2024-07-24 19:50:36.457715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:44.946 [2024-07-24 19:50:36.457736] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd30590 00:14:44.946 [2024-07-24 19:50:36.457749] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:44.946 [2024-07-24 19:50:36.459377] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:44.946 [2024-07-24 19:50:36.459417] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 
00:14:44.946 pt1 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:44.946 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:45.205 malloc2 00:14:45.205 19:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:45.464 [2024-07-24 19:50:36.983850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:45.464 [2024-07-24 19:50:36.983898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:45.464 [2024-07-24 19:50:36.983914] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed6690 00:14:45.464 [2024-07-24 19:50:36.983926] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:45.464 [2024-07-24 19:50:36.985322] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:14:45.464 [2024-07-24 19:50:36.985351] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:45.464 pt2 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:45.464 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:45.756 malloc3 00:14:45.756 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:46.015 [2024-07-24 19:50:37.485773] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:46.015 [2024-07-24 19:50:37.485820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:46.015 [2024-07-24 19:50:37.485836] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed7fc0 00:14:46.015 [2024-07-24 19:50:37.485849] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:14:46.015 [2024-07-24 19:50:37.487294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:46.015 [2024-07-24 19:50:37.487322] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:46.015 pt3 00:14:46.015 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:46.015 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:46.015 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:46.274 [2024-07-24 19:50:37.666270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:46.274 [2024-07-24 19:50:37.667475] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:46.274 [2024-07-24 19:50:37.667529] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:46.274 [2024-07-24 19:50:37.667685] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xed8d10 00:14:46.274 [2024-07-24 19:50:37.667696] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:46.274 [2024-07-24 19:50:37.667883] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd47480 00:14:46.274 [2024-07-24 19:50:37.668023] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xed8d10 00:14:46.274 [2024-07-24 19:50:37.668033] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xed8d10 00:14:46.274 [2024-07-24 19:50:37.668126] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.274 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:46.533 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.533 "name": "raid_bdev1", 00:14:46.533 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:46.533 "strip_size_kb": 64, 00:14:46.533 "state": "online", 00:14:46.533 "raid_level": "raid0", 00:14:46.533 "superblock": true, 00:14:46.533 "num_base_bdevs": 3, 00:14:46.533 "num_base_bdevs_discovered": 3, 00:14:46.533 "num_base_bdevs_operational": 3, 00:14:46.533 "base_bdevs_list": [ 00:14:46.533 { 00:14:46.533 "name": "pt1", 00:14:46.533 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:46.533 "is_configured": true, 00:14:46.533 "data_offset": 2048, 00:14:46.533 "data_size": 63488 00:14:46.533 }, 00:14:46.533 { 00:14:46.533 "name": "pt2", 
00:14:46.533 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:46.533 "is_configured": true, 00:14:46.533 "data_offset": 2048, 00:14:46.533 "data_size": 63488 00:14:46.533 }, 00:14:46.533 { 00:14:46.533 "name": "pt3", 00:14:46.533 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:46.533 "is_configured": true, 00:14:46.533 "data_offset": 2048, 00:14:46.533 "data_size": 63488 00:14:46.533 } 00:14:46.533 ] 00:14:46.533 }' 00:14:46.533 19:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.533 19:50:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:47.101 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:47.361 [2024-07-24 19:50:38.797514] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:47.361 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:47.361 "name": "raid_bdev1", 00:14:47.361 "aliases": [ 00:14:47.361 "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea" 00:14:47.361 ], 00:14:47.361 "product_name": "Raid Volume", 00:14:47.361 "block_size": 512, 
00:14:47.361 "num_blocks": 190464, 00:14:47.361 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:47.361 "assigned_rate_limits": { 00:14:47.361 "rw_ios_per_sec": 0, 00:14:47.361 "rw_mbytes_per_sec": 0, 00:14:47.361 "r_mbytes_per_sec": 0, 00:14:47.361 "w_mbytes_per_sec": 0 00:14:47.361 }, 00:14:47.361 "claimed": false, 00:14:47.361 "zoned": false, 00:14:47.361 "supported_io_types": { 00:14:47.361 "read": true, 00:14:47.361 "write": true, 00:14:47.361 "unmap": true, 00:14:47.361 "flush": true, 00:14:47.361 "reset": true, 00:14:47.361 "nvme_admin": false, 00:14:47.361 "nvme_io": false, 00:14:47.361 "nvme_io_md": false, 00:14:47.361 "write_zeroes": true, 00:14:47.361 "zcopy": false, 00:14:47.361 "get_zone_info": false, 00:14:47.361 "zone_management": false, 00:14:47.361 "zone_append": false, 00:14:47.361 "compare": false, 00:14:47.361 "compare_and_write": false, 00:14:47.361 "abort": false, 00:14:47.361 "seek_hole": false, 00:14:47.361 "seek_data": false, 00:14:47.361 "copy": false, 00:14:47.361 "nvme_iov_md": false 00:14:47.361 }, 00:14:47.361 "memory_domains": [ 00:14:47.361 { 00:14:47.361 "dma_device_id": "system", 00:14:47.361 "dma_device_type": 1 00:14:47.361 }, 00:14:47.361 { 00:14:47.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.361 "dma_device_type": 2 00:14:47.361 }, 00:14:47.361 { 00:14:47.361 "dma_device_id": "system", 00:14:47.361 "dma_device_type": 1 00:14:47.361 }, 00:14:47.361 { 00:14:47.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.361 "dma_device_type": 2 00:14:47.361 }, 00:14:47.361 { 00:14:47.361 "dma_device_id": "system", 00:14:47.361 "dma_device_type": 1 00:14:47.361 }, 00:14:47.361 { 00:14:47.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.361 "dma_device_type": 2 00:14:47.361 } 00:14:47.361 ], 00:14:47.361 "driver_specific": { 00:14:47.361 "raid": { 00:14:47.361 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:47.361 "strip_size_kb": 64, 00:14:47.361 "state": "online", 00:14:47.361 "raid_level": "raid0", 
00:14:47.361 "superblock": true, 00:14:47.361 "num_base_bdevs": 3, 00:14:47.361 "num_base_bdevs_discovered": 3, 00:14:47.361 "num_base_bdevs_operational": 3, 00:14:47.361 "base_bdevs_list": [ 00:14:47.361 { 00:14:47.361 "name": "pt1", 00:14:47.361 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:47.361 "is_configured": true, 00:14:47.361 "data_offset": 2048, 00:14:47.361 "data_size": 63488 00:14:47.361 }, 00:14:47.361 { 00:14:47.361 "name": "pt2", 00:14:47.361 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:47.361 "is_configured": true, 00:14:47.361 "data_offset": 2048, 00:14:47.361 "data_size": 63488 00:14:47.361 }, 00:14:47.361 { 00:14:47.361 "name": "pt3", 00:14:47.361 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:47.361 "is_configured": true, 00:14:47.361 "data_offset": 2048, 00:14:47.361 "data_size": 63488 00:14:47.361 } 00:14:47.361 ] 00:14:47.361 } 00:14:47.361 } 00:14:47.361 }' 00:14:47.361 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:47.361 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:47.361 pt2 00:14:47.361 pt3' 00:14:47.361 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:47.361 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:47.361 19:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:47.620 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:47.620 "name": "pt1", 00:14:47.620 "aliases": [ 00:14:47.620 "00000000-0000-0000-0000-000000000001" 00:14:47.620 ], 00:14:47.620 "product_name": "passthru", 00:14:47.620 "block_size": 512, 00:14:47.620 "num_blocks": 65536, 00:14:47.620 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:14:47.620 "assigned_rate_limits": { 00:14:47.620 "rw_ios_per_sec": 0, 00:14:47.620 "rw_mbytes_per_sec": 0, 00:14:47.620 "r_mbytes_per_sec": 0, 00:14:47.620 "w_mbytes_per_sec": 0 00:14:47.620 }, 00:14:47.620 "claimed": true, 00:14:47.620 "claim_type": "exclusive_write", 00:14:47.620 "zoned": false, 00:14:47.620 "supported_io_types": { 00:14:47.620 "read": true, 00:14:47.620 "write": true, 00:14:47.620 "unmap": true, 00:14:47.620 "flush": true, 00:14:47.620 "reset": true, 00:14:47.620 "nvme_admin": false, 00:14:47.620 "nvme_io": false, 00:14:47.620 "nvme_io_md": false, 00:14:47.620 "write_zeroes": true, 00:14:47.620 "zcopy": true, 00:14:47.620 "get_zone_info": false, 00:14:47.620 "zone_management": false, 00:14:47.620 "zone_append": false, 00:14:47.620 "compare": false, 00:14:47.620 "compare_and_write": false, 00:14:47.620 "abort": true, 00:14:47.620 "seek_hole": false, 00:14:47.620 "seek_data": false, 00:14:47.620 "copy": true, 00:14:47.620 "nvme_iov_md": false 00:14:47.620 }, 00:14:47.620 "memory_domains": [ 00:14:47.620 { 00:14:47.620 "dma_device_id": "system", 00:14:47.620 "dma_device_type": 1 00:14:47.620 }, 00:14:47.620 { 00:14:47.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.620 "dma_device_type": 2 00:14:47.620 } 00:14:47.620 ], 00:14:47.620 "driver_specific": { 00:14:47.620 "passthru": { 00:14:47.620 "name": "pt1", 00:14:47.620 "base_bdev_name": "malloc1" 00:14:47.620 } 00:14:47.620 } 00:14:47.620 }' 00:14:47.620 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.620 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.620 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:47.620 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.879 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.879 19:50:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:47.879 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.879 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.879 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:47.879 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:47.879 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.138 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.138 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.138 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:48.138 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.138 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.138 "name": "pt2", 00:14:48.138 "aliases": [ 00:14:48.138 "00000000-0000-0000-0000-000000000002" 00:14:48.138 ], 00:14:48.138 "product_name": "passthru", 00:14:48.138 "block_size": 512, 00:14:48.138 "num_blocks": 65536, 00:14:48.138 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:48.138 "assigned_rate_limits": { 00:14:48.138 "rw_ios_per_sec": 0, 00:14:48.138 "rw_mbytes_per_sec": 0, 00:14:48.138 "r_mbytes_per_sec": 0, 00:14:48.138 "w_mbytes_per_sec": 0 00:14:48.138 }, 00:14:48.138 "claimed": true, 00:14:48.138 "claim_type": "exclusive_write", 00:14:48.138 "zoned": false, 00:14:48.138 "supported_io_types": { 00:14:48.138 "read": true, 00:14:48.138 "write": true, 00:14:48.138 "unmap": true, 00:14:48.138 "flush": true, 00:14:48.138 "reset": true, 00:14:48.138 "nvme_admin": false, 00:14:48.138 
"nvme_io": false, 00:14:48.138 "nvme_io_md": false, 00:14:48.138 "write_zeroes": true, 00:14:48.138 "zcopy": true, 00:14:48.138 "get_zone_info": false, 00:14:48.138 "zone_management": false, 00:14:48.138 "zone_append": false, 00:14:48.138 "compare": false, 00:14:48.138 "compare_and_write": false, 00:14:48.138 "abort": true, 00:14:48.138 "seek_hole": false, 00:14:48.138 "seek_data": false, 00:14:48.138 "copy": true, 00:14:48.138 "nvme_iov_md": false 00:14:48.138 }, 00:14:48.138 "memory_domains": [ 00:14:48.138 { 00:14:48.138 "dma_device_id": "system", 00:14:48.138 "dma_device_type": 1 00:14:48.138 }, 00:14:48.138 { 00:14:48.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.138 "dma_device_type": 2 00:14:48.138 } 00:14:48.138 ], 00:14:48.138 "driver_specific": { 00:14:48.138 "passthru": { 00:14:48.138 "name": "pt2", 00:14:48.138 "base_bdev_name": "malloc2" 00:14:48.138 } 00:14:48.138 } 00:14:48.138 }' 00:14:48.138 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.138 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.397 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.656 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:48.656 19:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.914 "name": "pt3", 00:14:48.914 "aliases": [ 00:14:48.914 "00000000-0000-0000-0000-000000000003" 00:14:48.914 ], 00:14:48.914 "product_name": "passthru", 00:14:48.914 "block_size": 512, 00:14:48.914 "num_blocks": 65536, 00:14:48.914 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:48.914 "assigned_rate_limits": { 00:14:48.914 "rw_ios_per_sec": 0, 00:14:48.914 "rw_mbytes_per_sec": 0, 00:14:48.914 "r_mbytes_per_sec": 0, 00:14:48.914 "w_mbytes_per_sec": 0 00:14:48.914 }, 00:14:48.914 "claimed": true, 00:14:48.914 "claim_type": "exclusive_write", 00:14:48.914 "zoned": false, 00:14:48.914 "supported_io_types": { 00:14:48.914 "read": true, 00:14:48.914 "write": true, 00:14:48.914 "unmap": true, 00:14:48.914 "flush": true, 00:14:48.914 "reset": true, 00:14:48.914 "nvme_admin": false, 00:14:48.914 "nvme_io": false, 00:14:48.914 "nvme_io_md": false, 00:14:48.914 "write_zeroes": true, 00:14:48.914 "zcopy": true, 00:14:48.914 "get_zone_info": false, 00:14:48.914 "zone_management": false, 00:14:48.914 "zone_append": false, 00:14:48.914 "compare": false, 00:14:48.914 "compare_and_write": false, 00:14:48.914 "abort": true, 00:14:48.914 "seek_hole": false, 00:14:48.914 "seek_data": false, 00:14:48.914 "copy": true, 00:14:48.914 "nvme_iov_md": false 00:14:48.914 }, 00:14:48.914 "memory_domains": [ 00:14:48.914 { 00:14:48.914 "dma_device_id": "system", 00:14:48.914 
"dma_device_type": 1 00:14:48.914 }, 00:14:48.914 { 00:14:48.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.914 "dma_device_type": 2 00:14:48.914 } 00:14:48.914 ], 00:14:48.914 "driver_specific": { 00:14:48.914 "passthru": { 00:14:48.914 "name": "pt3", 00:14:48.914 "base_bdev_name": "malloc3" 00:14:48.914 } 00:14:48.914 } 00:14:48.914 }' 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.914 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:49.174 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:49.174 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.174 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.174 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:49.174 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:49.174 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:14:49.433 [2024-07-24 19:50:40.830886] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:49.433 19:50:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea 00:14:49.433 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea ']' 00:14:49.433 19:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:49.692 [2024-07-24 19:50:41.083283] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:49.692 [2024-07-24 19:50:41.083304] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:49.692 [2024-07-24 19:50:41.083357] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:49.692 [2024-07-24 19:50:41.083417] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:49.692 [2024-07-24 19:50:41.083429] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed8d10 name raid_bdev1, state offline 00:14:49.692 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.692 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:14:49.952 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:14:49.952 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:14:49.952 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:49.952 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:50.211 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in 
"${base_bdevs_pt[@]}" 00:14:50.211 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:50.470 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:50.470 19:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:50.470 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:50.470 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:50.730 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:50.989 [2024-07-24 19:50:42.502989] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:50.989 [2024-07-24 19:50:42.504396] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:50.989 [2024-07-24 19:50:42.504442] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:50.989 [2024-07-24 19:50:42.504492] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:50.989 [2024-07-24 19:50:42.504534] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:50.989 [2024-07-24 19:50:42.504557] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:50.989 [2024-07-24 19:50:42.504575] bdev_raid.c:2382:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:14:50.989 [2024-07-24 19:50:42.504585] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd27c50 name raid_bdev1, state configuring 00:14:50.989 request: 00:14:50.989 { 00:14:50.989 "name": "raid_bdev1", 00:14:50.989 "raid_level": "raid0", 00:14:50.989 "base_bdevs": [ 00:14:50.989 "malloc1", 00:14:50.989 "malloc2", 00:14:50.989 "malloc3" 00:14:50.989 ], 00:14:50.989 "strip_size_kb": 64, 00:14:50.989 "superblock": false, 00:14:50.989 "method": "bdev_raid_create", 00:14:50.989 "req_id": 1 00:14:50.989 } 00:14:50.989 Got JSON-RPC error response 00:14:50.989 response: 00:14:50.989 { 00:14:50.989 "code": -17, 00:14:50.989 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:50.989 } 00:14:50.989 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:14:50.989 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:50.989 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:50.989 19:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:50.989 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.989 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:14:51.249 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:14:51.249 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:14:51.249 19:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:51.509 [2024-07-24 19:50:42.992209] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on malloc1 00:14:51.509 [2024-07-24 19:50:42.992253] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:51.509 [2024-07-24 19:50:42.992271] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed6460 00:14:51.509 [2024-07-24 19:50:42.992284] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:51.509 [2024-07-24 19:50:42.993958] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:51.509 [2024-07-24 19:50:42.993991] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:51.509 [2024-07-24 19:50:42.994062] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:51.509 [2024-07-24 19:50:42.994090] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:51.509 pt1 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.509 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:51.768 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.768 "name": "raid_bdev1", 00:14:51.768 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:51.768 "strip_size_kb": 64, 00:14:51.768 "state": "configuring", 00:14:51.768 "raid_level": "raid0", 00:14:51.768 "superblock": true, 00:14:51.768 "num_base_bdevs": 3, 00:14:51.768 "num_base_bdevs_discovered": 1, 00:14:51.768 "num_base_bdevs_operational": 3, 00:14:51.768 "base_bdevs_list": [ 00:14:51.768 { 00:14:51.768 "name": "pt1", 00:14:51.768 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:51.768 "is_configured": true, 00:14:51.768 "data_offset": 2048, 00:14:51.768 "data_size": 63488 00:14:51.768 }, 00:14:51.768 { 00:14:51.768 "name": null, 00:14:51.768 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:51.768 "is_configured": false, 00:14:51.768 "data_offset": 2048, 00:14:51.768 "data_size": 63488 00:14:51.768 }, 00:14:51.768 { 00:14:51.768 "name": null, 00:14:51.768 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:51.768 "is_configured": false, 00:14:51.768 "data_offset": 2048, 00:14:51.768 "data_size": 63488 00:14:51.768 } 00:14:51.768 ] 00:14:51.768 }' 00:14:51.768 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.768 19:50:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.335 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:14:52.335 19:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:52.594 [2024-07-24 19:50:44.091147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:52.594 [2024-07-24 19:50:44.091199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:52.594 [2024-07-24 19:50:44.091219] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd28260 00:14:52.594 [2024-07-24 19:50:44.091231] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:52.594 [2024-07-24 19:50:44.091585] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:52.594 [2024-07-24 19:50:44.091603] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:52.594 [2024-07-24 19:50:44.091664] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:52.594 [2024-07-24 19:50:44.091684] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:52.594 pt2 00:14:52.594 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:52.853 [2024-07-24 19:50:44.283670] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:52.853 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.112 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.112 "name": "raid_bdev1", 00:14:53.112 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:53.112 "strip_size_kb": 64, 00:14:53.112 "state": "configuring", 00:14:53.112 "raid_level": "raid0", 00:14:53.112 "superblock": true, 00:14:53.112 "num_base_bdevs": 3, 00:14:53.112 "num_base_bdevs_discovered": 1, 00:14:53.112 "num_base_bdevs_operational": 3, 00:14:53.112 "base_bdevs_list": [ 00:14:53.112 { 00:14:53.112 "name": "pt1", 00:14:53.112 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:53.112 "is_configured": true, 00:14:53.112 "data_offset": 2048, 00:14:53.112 "data_size": 63488 00:14:53.112 }, 00:14:53.112 { 00:14:53.112 "name": null, 00:14:53.112 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:53.112 "is_configured": false, 00:14:53.112 "data_offset": 2048, 00:14:53.112 "data_size": 63488 00:14:53.112 }, 00:14:53.112 { 00:14:53.112 "name": null, 00:14:53.112 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:53.112 "is_configured": false, 00:14:53.112 "data_offset": 2048, 00:14:53.112 "data_size": 63488 00:14:53.112 } 00:14:53.112 ] 00:14:53.112 }' 
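The `verify_raid_bdev_state` helper seen above works by fetching `bdev_raid_get_bdevs all` and isolating one entry with `jq 'select(.name == ...)'` before comparing fields like `state` and `raid_level`. A minimal sketch of that jq selection logic, run against an illustrative fixture file (the fixture path and its trimmed-down field set are assumptions; field names are taken from the log output):

```shell
#!/usr/bin/env bash
set -e
# Hypothetical fixture mirroring the shape of `bdev_raid_get_bdevs all`
# output dumped in the log; only the fields the state check reads are kept.
cat > /tmp/raid_bdevs.json <<'EOF'
[
  {
    "name": "raid_bdev1",
    "strip_size_kb": 64,
    "state": "configuring",
    "raid_level": "raid0",
    "num_base_bdevs": 3,
    "num_base_bdevs_discovered": 1,
    "num_base_bdevs_operational": 3
  }
]
EOF
# Same filter the test's @126 step uses: pick the raid bdev by name.
tmp=$(jq -r '.[] | select(.name == "raid_bdev1")' /tmp/raid_bdevs.json)
# Then individual fields are extracted from the captured object and
# compared against the expected values passed to the helper.
state=$(echo "$tmp" | jq -r .state)
level=$(echo "$tmp" | jq -r .raid_level)
echo "$state $level"
```

With only `pt1` attached, `num_base_bdevs_discovered` stays at 1 while `state` remains `configuring`; the log shows the same JSON flip to `online`/3 once all three passthru bdevs are claimed.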
00:14:53.113 19:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.113 19:50:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.679 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:14:53.679 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:53.679 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:53.938 [2024-07-24 19:50:45.306396] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:53.938 [2024-07-24 19:50:45.306444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:53.938 [2024-07-24 19:50:45.306463] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed6a60 00:14:53.938 [2024-07-24 19:50:45.306480] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:53.938 [2024-07-24 19:50:45.306817] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:53.938 [2024-07-24 19:50:45.306835] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:53.938 [2024-07-24 19:50:45.306895] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:53.938 [2024-07-24 19:50:45.306914] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:53.938 pt2 00:14:53.938 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:14:53.938 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:53.938 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:54.201 [2024-07-24 19:50:45.555054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:54.201 [2024-07-24 19:50:45.555088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:54.201 [2024-07-24 19:50:45.555105] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd307c0 00:14:54.201 [2024-07-24 19:50:45.555116] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.201 [2024-07-24 19:50:45.555425] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.201 [2024-07-24 19:50:45.555442] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:54.201 [2024-07-24 19:50:45.555498] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:54.201 [2024-07-24 19:50:45.555515] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:54.201 [2024-07-24 19:50:45.555617] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd273c0 00:14:54.201 [2024-07-24 19:50:45.555628] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:54.201 [2024-07-24 19:50:45.555801] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd2b730 00:14:54.201 [2024-07-24 19:50:45.555937] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd273c0 00:14:54.201 [2024-07-24 19:50:45.555948] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd273c0 00:14:54.201 [2024-07-24 19:50:45.556042] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:54.201 pt3 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.201 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:54.459 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.459 "name": "raid_bdev1", 00:14:54.459 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:54.459 "strip_size_kb": 64, 00:14:54.459 "state": "online", 00:14:54.459 "raid_level": "raid0", 00:14:54.459 "superblock": true, 00:14:54.459 "num_base_bdevs": 3, 00:14:54.459 "num_base_bdevs_discovered": 3, 00:14:54.459 "num_base_bdevs_operational": 3, 00:14:54.459 "base_bdevs_list": [ 00:14:54.459 { 00:14:54.459 
"name": "pt1", 00:14:54.459 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:54.459 "is_configured": true, 00:14:54.459 "data_offset": 2048, 00:14:54.459 "data_size": 63488 00:14:54.459 }, 00:14:54.459 { 00:14:54.459 "name": "pt2", 00:14:54.459 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:54.459 "is_configured": true, 00:14:54.460 "data_offset": 2048, 00:14:54.460 "data_size": 63488 00:14:54.460 }, 00:14:54.460 { 00:14:54.460 "name": "pt3", 00:14:54.460 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:54.460 "is_configured": true, 00:14:54.460 "data_offset": 2048, 00:14:54.460 "data_size": 63488 00:14:54.460 } 00:14:54.460 ] 00:14:54.460 }' 00:14:54.460 19:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.460 19:50:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:55.027 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:55.287 [2024-07-24 19:50:46.654243] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:55.287 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 
-- # raid_bdev_info='{ 00:14:55.287 "name": "raid_bdev1", 00:14:55.287 "aliases": [ 00:14:55.287 "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea" 00:14:55.287 ], 00:14:55.287 "product_name": "Raid Volume", 00:14:55.287 "block_size": 512, 00:14:55.287 "num_blocks": 190464, 00:14:55.287 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:55.287 "assigned_rate_limits": { 00:14:55.287 "rw_ios_per_sec": 0, 00:14:55.287 "rw_mbytes_per_sec": 0, 00:14:55.287 "r_mbytes_per_sec": 0, 00:14:55.287 "w_mbytes_per_sec": 0 00:14:55.287 }, 00:14:55.287 "claimed": false, 00:14:55.287 "zoned": false, 00:14:55.287 "supported_io_types": { 00:14:55.287 "read": true, 00:14:55.287 "write": true, 00:14:55.287 "unmap": true, 00:14:55.287 "flush": true, 00:14:55.287 "reset": true, 00:14:55.287 "nvme_admin": false, 00:14:55.287 "nvme_io": false, 00:14:55.287 "nvme_io_md": false, 00:14:55.287 "write_zeroes": true, 00:14:55.287 "zcopy": false, 00:14:55.287 "get_zone_info": false, 00:14:55.287 "zone_management": false, 00:14:55.287 "zone_append": false, 00:14:55.287 "compare": false, 00:14:55.287 "compare_and_write": false, 00:14:55.287 "abort": false, 00:14:55.287 "seek_hole": false, 00:14:55.287 "seek_data": false, 00:14:55.287 "copy": false, 00:14:55.287 "nvme_iov_md": false 00:14:55.287 }, 00:14:55.287 "memory_domains": [ 00:14:55.287 { 00:14:55.287 "dma_device_id": "system", 00:14:55.287 "dma_device_type": 1 00:14:55.287 }, 00:14:55.287 { 00:14:55.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.287 "dma_device_type": 2 00:14:55.287 }, 00:14:55.287 { 00:14:55.287 "dma_device_id": "system", 00:14:55.287 "dma_device_type": 1 00:14:55.287 }, 00:14:55.287 { 00:14:55.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.287 "dma_device_type": 2 00:14:55.287 }, 00:14:55.287 { 00:14:55.287 "dma_device_id": "system", 00:14:55.287 "dma_device_type": 1 00:14:55.287 }, 00:14:55.287 { 00:14:55.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.287 "dma_device_type": 2 00:14:55.287 } 00:14:55.287 ], 
00:14:55.287 "driver_specific": { 00:14:55.287 "raid": { 00:14:55.287 "uuid": "abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea", 00:14:55.287 "strip_size_kb": 64, 00:14:55.287 "state": "online", 00:14:55.287 "raid_level": "raid0", 00:14:55.287 "superblock": true, 00:14:55.287 "num_base_bdevs": 3, 00:14:55.287 "num_base_bdevs_discovered": 3, 00:14:55.287 "num_base_bdevs_operational": 3, 00:14:55.287 "base_bdevs_list": [ 00:14:55.287 { 00:14:55.287 "name": "pt1", 00:14:55.287 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:55.287 "is_configured": true, 00:14:55.287 "data_offset": 2048, 00:14:55.287 "data_size": 63488 00:14:55.287 }, 00:14:55.287 { 00:14:55.287 "name": "pt2", 00:14:55.287 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:55.287 "is_configured": true, 00:14:55.287 "data_offset": 2048, 00:14:55.287 "data_size": 63488 00:14:55.287 }, 00:14:55.287 { 00:14:55.287 "name": "pt3", 00:14:55.287 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:55.287 "is_configured": true, 00:14:55.287 "data_offset": 2048, 00:14:55.287 "data_size": 63488 00:14:55.287 } 00:14:55.287 ] 00:14:55.287 } 00:14:55.287 } 00:14:55.287 }' 00:14:55.287 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:55.287 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:55.287 pt2 00:14:55.287 pt3' 00:14:55.287 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:55.287 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:55.287 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:55.547 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:55.547 "name": "pt1", 00:14:55.547 "aliases": [ 
00:14:55.547 "00000000-0000-0000-0000-000000000001" 00:14:55.547 ], 00:14:55.547 "product_name": "passthru", 00:14:55.547 "block_size": 512, 00:14:55.547 "num_blocks": 65536, 00:14:55.547 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:55.547 "assigned_rate_limits": { 00:14:55.547 "rw_ios_per_sec": 0, 00:14:55.547 "rw_mbytes_per_sec": 0, 00:14:55.547 "r_mbytes_per_sec": 0, 00:14:55.547 "w_mbytes_per_sec": 0 00:14:55.547 }, 00:14:55.547 "claimed": true, 00:14:55.547 "claim_type": "exclusive_write", 00:14:55.547 "zoned": false, 00:14:55.547 "supported_io_types": { 00:14:55.547 "read": true, 00:14:55.547 "write": true, 00:14:55.547 "unmap": true, 00:14:55.547 "flush": true, 00:14:55.547 "reset": true, 00:14:55.547 "nvme_admin": false, 00:14:55.547 "nvme_io": false, 00:14:55.547 "nvme_io_md": false, 00:14:55.547 "write_zeroes": true, 00:14:55.547 "zcopy": true, 00:14:55.547 "get_zone_info": false, 00:14:55.547 "zone_management": false, 00:14:55.547 "zone_append": false, 00:14:55.547 "compare": false, 00:14:55.547 "compare_and_write": false, 00:14:55.547 "abort": true, 00:14:55.547 "seek_hole": false, 00:14:55.547 "seek_data": false, 00:14:55.547 "copy": true, 00:14:55.547 "nvme_iov_md": false 00:14:55.547 }, 00:14:55.547 "memory_domains": [ 00:14:55.547 { 00:14:55.547 "dma_device_id": "system", 00:14:55.547 "dma_device_type": 1 00:14:55.547 }, 00:14:55.547 { 00:14:55.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.547 "dma_device_type": 2 00:14:55.547 } 00:14:55.547 ], 00:14:55.547 "driver_specific": { 00:14:55.547 "passthru": { 00:14:55.547 "name": "pt1", 00:14:55.547 "base_bdev_name": "malloc1" 00:14:55.547 } 00:14:55.547 } 00:14:55.547 }' 00:14:55.547 19:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.547 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:55.547 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:55.547 19:50:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.547 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:55.806 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.065 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.065 "name": "pt2", 00:14:56.065 "aliases": [ 00:14:56.065 "00000000-0000-0000-0000-000000000002" 00:14:56.065 ], 00:14:56.065 "product_name": "passthru", 00:14:56.065 "block_size": 512, 00:14:56.065 "num_blocks": 65536, 00:14:56.065 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:56.065 "assigned_rate_limits": { 00:14:56.065 "rw_ios_per_sec": 0, 00:14:56.065 "rw_mbytes_per_sec": 0, 00:14:56.065 "r_mbytes_per_sec": 0, 00:14:56.065 "w_mbytes_per_sec": 0 00:14:56.065 }, 00:14:56.065 "claimed": true, 00:14:56.065 "claim_type": "exclusive_write", 00:14:56.065 "zoned": false, 00:14:56.065 "supported_io_types": { 
00:14:56.065 "read": true, 00:14:56.065 "write": true, 00:14:56.065 "unmap": true, 00:14:56.065 "flush": true, 00:14:56.065 "reset": true, 00:14:56.065 "nvme_admin": false, 00:14:56.065 "nvme_io": false, 00:14:56.065 "nvme_io_md": false, 00:14:56.065 "write_zeroes": true, 00:14:56.065 "zcopy": true, 00:14:56.065 "get_zone_info": false, 00:14:56.065 "zone_management": false, 00:14:56.065 "zone_append": false, 00:14:56.065 "compare": false, 00:14:56.065 "compare_and_write": false, 00:14:56.065 "abort": true, 00:14:56.065 "seek_hole": false, 00:14:56.065 "seek_data": false, 00:14:56.065 "copy": true, 00:14:56.065 "nvme_iov_md": false 00:14:56.065 }, 00:14:56.065 "memory_domains": [ 00:14:56.065 { 00:14:56.065 "dma_device_id": "system", 00:14:56.065 "dma_device_type": 1 00:14:56.065 }, 00:14:56.065 { 00:14:56.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.065 "dma_device_type": 2 00:14:56.065 } 00:14:56.065 ], 00:14:56.065 "driver_specific": { 00:14:56.065 "passthru": { 00:14:56.065 "name": "pt2", 00:14:56.065 "base_bdev_name": "malloc2" 00:14:56.065 } 00:14:56.065 } 00:14:56.065 }' 00:14:56.065 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.065 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.065 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.065 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.065 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
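The per-bdev checks at @205 through @208 above compare `jq` output against shell literals: `block_size` must equal 512, while `md_size`, `md_interleave`, and `dif_type` must print `null`. The `null` comparisons pass because `jq` emits the literal `null` for a key absent from the object. A small sketch of that mechanism against an illustrative fixture (the file path and trimmed field set are assumptions, not the real RPC output):

```shell
#!/usr/bin/env bash
set -e
# Hypothetical stand-in for one `bdev_get_bdevs -b ptN` entry; metadata
# fields (md_size, md_interleave, dif_type) are deliberately omitted.
cat > /tmp/pt_bdev.json <<'EOF'
{ "name": "pt2", "product_name": "passthru", "block_size": 512, "num_blocks": 65536 }
EOF
# jq prints the bare value for present keys and `null` for missing ones,
# so the test's `[[ null == null ]]` string comparison succeeds.
[[ $(jq .block_size /tmp/pt_bdev.json) == 512 ]]
[[ $(jq .md_size /tmp/pt_bdev.json) == null ]]
[[ $(jq .md_interleave /tmp/pt_bdev.json) == null ]]
[[ $(jq .dif_type /tmp/pt_bdev.json) == null ]]
echo ok
```

Note this only distinguishes "key absent or JSON null" from a concrete value; a bdev that reported `"dif_type": 0` rather than omitting the key would fail the same check.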
00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:56.323 19:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.581 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.581 "name": "pt3", 00:14:56.581 "aliases": [ 00:14:56.581 "00000000-0000-0000-0000-000000000003" 00:14:56.581 ], 00:14:56.581 "product_name": "passthru", 00:14:56.581 "block_size": 512, 00:14:56.581 "num_blocks": 65536, 00:14:56.581 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:56.581 "assigned_rate_limits": { 00:14:56.581 "rw_ios_per_sec": 0, 00:14:56.581 "rw_mbytes_per_sec": 0, 00:14:56.581 "r_mbytes_per_sec": 0, 00:14:56.581 "w_mbytes_per_sec": 0 00:14:56.581 }, 00:14:56.581 "claimed": true, 00:14:56.581 "claim_type": "exclusive_write", 00:14:56.581 "zoned": false, 00:14:56.581 "supported_io_types": { 00:14:56.581 "read": true, 00:14:56.581 "write": true, 00:14:56.581 "unmap": true, 00:14:56.581 "flush": true, 00:14:56.581 "reset": true, 00:14:56.581 "nvme_admin": false, 00:14:56.581 "nvme_io": false, 00:14:56.581 "nvme_io_md": false, 00:14:56.581 "write_zeroes": true, 00:14:56.581 "zcopy": true, 00:14:56.581 "get_zone_info": false, 00:14:56.581 "zone_management": false, 00:14:56.581 "zone_append": false, 00:14:56.581 "compare": false, 00:14:56.581 "compare_and_write": false, 00:14:56.581 "abort": true, 00:14:56.581 "seek_hole": false, 00:14:56.581 "seek_data": 
false, 00:14:56.581 "copy": true, 00:14:56.581 "nvme_iov_md": false 00:14:56.581 }, 00:14:56.581 "memory_domains": [ 00:14:56.581 { 00:14:56.581 "dma_device_id": "system", 00:14:56.581 "dma_device_type": 1 00:14:56.581 }, 00:14:56.581 { 00:14:56.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.581 "dma_device_type": 2 00:14:56.581 } 00:14:56.581 ], 00:14:56.581 "driver_specific": { 00:14:56.581 "passthru": { 00:14:56.581 "name": "pt3", 00:14:56.581 "base_bdev_name": "malloc3" 00:14:56.581 } 00:14:56.581 } 00:14:56.581 }' 00:14:56.581 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.581 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.581 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.581 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.581 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:56.841 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 
-- # jq -r '.[] | .uuid' 00:14:57.100 [2024-07-24 19:50:48.583410] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea '!=' abd8a1f3-1ad0-4195-b0b5-ceec06cee9ea ']' 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1403179 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1403179 ']' 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1403179 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1403179 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1403179' 00:14:57.100 killing process with pid 1403179 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1403179 00:14:57.100 [2024-07-24 19:50:48.654830] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:57.100 [2024-07-24 19:50:48.654890] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:14:57.100 [2024-07-24 19:50:48.654944] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:57.100 [2024-07-24 19:50:48.654956] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd273c0 name raid_bdev1, state offline 00:14:57.100 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1403179 00:14:57.100 [2024-07-24 19:50:48.685474] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:57.360 19:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:14:57.360 00:14:57.360 real 0m13.881s 00:14:57.360 user 0m24.862s 00:14:57.360 sys 0m2.635s 00:14:57.360 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:57.360 19:50:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.360 ************************************ 00:14:57.360 END TEST raid_superblock_test 00:14:57.360 ************************************ 00:14:57.620 19:50:48 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:57.620 19:50:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:57.620 19:50:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:57.620 19:50:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:57.620 ************************************ 00:14:57.620 START TEST raid_read_error_test 00:14:57.620 ************************************ 00:14:57.620 19:50:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 read 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local 
error_io_type=read 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:14:57.620 
19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.fKJxy97Gd3 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1405227 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1405227 /var/tmp/spdk-raid.sock 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1405227 ']' 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:57.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:57.620 19:50:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.620 [2024-07-24 19:50:49.081362] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:14:57.620 [2024-07-24 19:50:49.081441] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1405227 ] 00:14:57.880 [2024-07-24 19:50:49.214281] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:57.880 [2024-07-24 19:50:49.320424] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:57.880 [2024-07-24 19:50:49.389458] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:57.880 [2024-07-24 19:50:49.389487] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:58.449 19:50:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:58.449 19:50:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:58.449 19:50:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:58.449 19:50:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:58.708 BaseBdev1_malloc 00:14:58.708 19:50:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:58.966 true 00:14:58.966 19:50:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:59.224 [2024-07-24 19:50:50.748475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:59.224 [2024-07-24 19:50:50.748522] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:59.224 [2024-07-24 19:50:50.748542] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10093a0 00:14:59.224 [2024-07-24 19:50:50.748554] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:59.224 [2024-07-24 19:50:50.750201] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:59.224 [2024-07-24 19:50:50.750231] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:59.224 BaseBdev1 00:14:59.224 19:50:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:59.224 19:50:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:59.484 BaseBdev2_malloc 00:14:59.484 19:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:59.743 true 00:14:59.743 19:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:00.002 [2024-07-24 19:50:51.535237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:00.002 [2024-07-24 19:50:51.535283] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:00.002 [2024-07-24 19:50:51.535307] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10c8370 00:15:00.002 [2024-07-24 19:50:51.535320] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:00.002 [2024-07-24 19:50:51.536773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:00.002 [2024-07-24 19:50:51.536818] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:00.002 BaseBdev2 00:15:00.002 19:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:00.002 19:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:00.261 BaseBdev3_malloc 00:15:00.261 19:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:00.521 true 00:15:00.521 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:00.780 [2024-07-24 19:50:52.277688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:00.780 [2024-07-24 19:50:52.277729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:00.780 [2024-07-24 19:50:52.277751] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xffe2d0 00:15:00.780 [2024-07-24 19:50:52.277768] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:00.780 [2024-07-24 19:50:52.279159] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:00.780 [2024-07-24 19:50:52.279186] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:00.780 BaseBdev3 00:15:00.780 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:01.040 [2024-07-24 19:50:52.526377] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:01.040 [2024-07-24 19:50:52.527613] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:01.040 [2024-07-24 19:50:52.527681] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:01.040 [2024-07-24 19:50:52.527885] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfff860 00:15:01.040 [2024-07-24 19:50:52.527897] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:01.040 [2024-07-24 19:50:52.528084] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10016a0 00:15:01.040 [2024-07-24 19:50:52.528228] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfff860 00:15:01.040 [2024-07-24 19:50:52.528238] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfff860 00:15:01.040 [2024-07-24 19:50:52.528339] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.040 
19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.040 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:01.299 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.299 "name": "raid_bdev1", 00:15:01.299 "uuid": "3085896a-aead-4394-9f74-43e8ade9ef5f", 00:15:01.299 "strip_size_kb": 64, 00:15:01.299 "state": "online", 00:15:01.299 "raid_level": "raid0", 00:15:01.299 "superblock": true, 00:15:01.299 "num_base_bdevs": 3, 00:15:01.299 "num_base_bdevs_discovered": 3, 00:15:01.299 "num_base_bdevs_operational": 3, 00:15:01.299 "base_bdevs_list": [ 00:15:01.299 { 00:15:01.299 "name": "BaseBdev1", 00:15:01.299 "uuid": "4221e88f-6c3f-5f21-ae55-3193d571703b", 00:15:01.299 "is_configured": true, 00:15:01.299 "data_offset": 2048, 00:15:01.299 "data_size": 63488 00:15:01.299 }, 00:15:01.299 { 00:15:01.299 "name": "BaseBdev2", 00:15:01.299 "uuid": "ef014834-00fc-5452-a92b-3e224061fa99", 00:15:01.299 "is_configured": true, 00:15:01.299 "data_offset": 2048, 00:15:01.299 "data_size": 63488 00:15:01.299 }, 00:15:01.299 { 00:15:01.299 "name": "BaseBdev3", 00:15:01.299 "uuid": "0dbad448-709a-519d-ba37-e438cfbc7c07", 00:15:01.299 "is_configured": true, 00:15:01.299 "data_offset": 2048, 00:15:01.299 "data_size": 63488 00:15:01.299 } 00:15:01.299 ] 00:15:01.299 }' 00:15:01.299 19:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.299 19:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.875 19:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- 
# sleep 1 00:15:01.875 19:50:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:02.200 [2024-07-24 19:50:53.533365] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1001610 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.138 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.398 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.398 "name": "raid_bdev1", 00:15:03.398 "uuid": "3085896a-aead-4394-9f74-43e8ade9ef5f", 00:15:03.398 "strip_size_kb": 64, 00:15:03.398 "state": "online", 00:15:03.398 "raid_level": "raid0", 00:15:03.398 "superblock": true, 00:15:03.398 "num_base_bdevs": 3, 00:15:03.398 "num_base_bdevs_discovered": 3, 00:15:03.398 "num_base_bdevs_operational": 3, 00:15:03.398 "base_bdevs_list": [ 00:15:03.398 { 00:15:03.398 "name": "BaseBdev1", 00:15:03.398 "uuid": "4221e88f-6c3f-5f21-ae55-3193d571703b", 00:15:03.398 "is_configured": true, 00:15:03.398 "data_offset": 2048, 00:15:03.398 "data_size": 63488 00:15:03.398 }, 00:15:03.398 { 00:15:03.398 "name": "BaseBdev2", 00:15:03.398 "uuid": "ef014834-00fc-5452-a92b-3e224061fa99", 00:15:03.398 "is_configured": true, 00:15:03.398 "data_offset": 2048, 00:15:03.398 "data_size": 63488 00:15:03.398 }, 00:15:03.398 { 00:15:03.398 "name": "BaseBdev3", 00:15:03.398 "uuid": "0dbad448-709a-519d-ba37-e438cfbc7c07", 00:15:03.398 "is_configured": true, 00:15:03.398 "data_offset": 2048, 00:15:03.398 "data_size": 63488 00:15:03.398 } 00:15:03.398 ] 00:15:03.398 }' 00:15:03.398 19:50:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.398 19:50:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:03.967 19:50:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:04.226 [2024-07-24 19:50:55.718476] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:04.226 [2024-07-24 19:50:55.718521] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:04.226 [2024-07-24 19:50:55.721693] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:04.226 [2024-07-24 19:50:55.721730] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.226 [2024-07-24 19:50:55.721767] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:04.226 [2024-07-24 19:50:55.721778] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfff860 name raid_bdev1, state offline 00:15:04.226 0 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1405227 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1405227 ']' 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1405227 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1405227 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1405227' 00:15:04.226 killing process with pid 1405227 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1405227 00:15:04.226 [2024-07-24 19:50:55.799357] bdev_raid.c:1373:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:15:04.226 19:50:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1405227 00:15:04.485 [2024-07-24 19:50:55.820505] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.fKJxy97Gd3 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:15:04.485 00:15:04.485 real 0m7.060s 00:15:04.485 user 0m11.163s 00:15:04.485 sys 0m1.279s 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:04.485 19:50:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.485 ************************************ 00:15:04.485 END TEST raid_read_error_test 00:15:04.485 ************************************ 00:15:04.745 19:50:56 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:15:04.745 19:50:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:04.745 19:50:56 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:04.745 19:50:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:04.745 ************************************ 00:15:04.745 START TEST raid_write_error_test 00:15:04.745 
************************************ 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 write 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:04.745 19:50:56 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.YxjH3HAAKs 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1406211 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1406211 /var/tmp/spdk-raid.sock 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1406211 ']' 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:04.745 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:04.745 19:50:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.745 [2024-07-24 19:50:56.224970] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:15:04.745 [2024-07-24 19:50:56.225025] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1406211 ] 00:15:05.004 [2024-07-24 19:50:56.340083] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:05.004 [2024-07-24 19:50:56.450709] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:05.004 [2024-07-24 19:50:56.512292] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:05.004 [2024-07-24 19:50:56.512320] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:05.572 19:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:05.572 19:50:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:05.572 19:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:05.572 19:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:05.831 BaseBdev1_malloc 00:15:05.831 19:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:06.090 true 00:15:06.090 19:50:57 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:06.349 [2024-07-24 19:50:57.829121] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:06.349 [2024-07-24 19:50:57.829168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:06.349 [2024-07-24 19:50:57.829187] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12003a0 00:15:06.349 [2024-07-24 19:50:57.829200] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:06.349 [2024-07-24 19:50:57.830894] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:06.349 [2024-07-24 19:50:57.830924] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:06.349 BaseBdev1 00:15:06.349 19:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:06.349 19:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:06.608 BaseBdev2_malloc 00:15:06.608 19:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:06.867 true 00:15:06.867 19:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:07.126 [2024-07-24 19:50:58.551691] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:07.126 [2024-07-24 19:50:58.551737] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.126 [2024-07-24 19:50:58.551761] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12bf370 00:15:07.126 [2024-07-24 19:50:58.551774] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.126 [2024-07-24 19:50:58.553298] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.127 [2024-07-24 19:50:58.553325] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:07.127 BaseBdev2 00:15:07.127 19:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:07.127 19:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:07.386 BaseBdev3_malloc 00:15:07.386 19:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:07.645 true 00:15:07.645 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:07.904 [2024-07-24 19:50:59.294257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:07.904 [2024-07-24 19:50:59.294302] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.904 [2024-07-24 19:50:59.294323] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11f52d0 00:15:07.904 [2024-07-24 19:50:59.294335] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.904 [2024-07-24 19:50:59.295864] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:15:07.904 [2024-07-24 19:50:59.295892] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:07.904 BaseBdev3 00:15:07.904 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:07.904 [2024-07-24 19:50:59.478781] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:07.904 [2024-07-24 19:50:59.479956] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:07.904 [2024-07-24 19:50:59.480023] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:07.904 [2024-07-24 19:50:59.480232] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11f6860 00:15:07.904 [2024-07-24 19:50:59.480244] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:07.904 [2024-07-24 19:50:59.480440] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f86a0 00:15:07.904 [2024-07-24 19:50:59.480584] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11f6860 00:15:07.904 [2024-07-24 19:50:59.480594] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11f6860 00:15:07.904 [2024-07-24 19:50:59.480693] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid0 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.164 "name": "raid_bdev1", 00:15:08.164 "uuid": "e0e64eda-eb3d-452a-b9bb-8d6b8cebe889", 00:15:08.164 "strip_size_kb": 64, 00:15:08.164 "state": "online", 00:15:08.164 "raid_level": "raid0", 00:15:08.164 "superblock": true, 00:15:08.164 "num_base_bdevs": 3, 00:15:08.164 "num_base_bdevs_discovered": 3, 00:15:08.164 "num_base_bdevs_operational": 3, 00:15:08.164 "base_bdevs_list": [ 00:15:08.164 { 00:15:08.164 "name": "BaseBdev1", 00:15:08.164 "uuid": "ae738e87-b40e-5028-9121-ee91c0af3e28", 00:15:08.164 "is_configured": true, 00:15:08.164 "data_offset": 2048, 00:15:08.164 "data_size": 63488 00:15:08.164 }, 00:15:08.164 { 00:15:08.164 "name": "BaseBdev2", 00:15:08.164 "uuid": "8208a4b7-f1c8-52da-a39c-0e0d0f5cd302", 00:15:08.164 "is_configured": true, 00:15:08.164 "data_offset": 2048, 00:15:08.164 "data_size": 63488 00:15:08.164 }, 00:15:08.164 { 00:15:08.164 "name": "BaseBdev3", 
00:15:08.164 "uuid": "93df7991-047d-567f-a41c-74447c6e29ad", 00:15:08.164 "is_configured": true, 00:15:08.164 "data_offset": 2048, 00:15:08.164 "data_size": 63488 00:15:08.164 } 00:15:08.164 ] 00:15:08.164 }' 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.164 19:50:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.743 19:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:08.743 19:51:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:09.003 [2024-07-24 19:51:00.421640] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f8610 00:15:09.940 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.198 19:51:01 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.198 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:10.456 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.456 "name": "raid_bdev1", 00:15:10.456 "uuid": "e0e64eda-eb3d-452a-b9bb-8d6b8cebe889", 00:15:10.456 "strip_size_kb": 64, 00:15:10.456 "state": "online", 00:15:10.456 "raid_level": "raid0", 00:15:10.456 "superblock": true, 00:15:10.456 "num_base_bdevs": 3, 00:15:10.456 "num_base_bdevs_discovered": 3, 00:15:10.456 "num_base_bdevs_operational": 3, 00:15:10.456 "base_bdevs_list": [ 00:15:10.456 { 00:15:10.456 "name": "BaseBdev1", 00:15:10.456 "uuid": "ae738e87-b40e-5028-9121-ee91c0af3e28", 00:15:10.456 "is_configured": true, 00:15:10.456 "data_offset": 2048, 00:15:10.456 "data_size": 63488 00:15:10.456 }, 00:15:10.456 { 00:15:10.456 "name": "BaseBdev2", 00:15:10.456 "uuid": "8208a4b7-f1c8-52da-a39c-0e0d0f5cd302", 00:15:10.456 "is_configured": true, 00:15:10.456 "data_offset": 2048, 00:15:10.456 "data_size": 63488 00:15:10.456 }, 00:15:10.456 { 00:15:10.456 "name": "BaseBdev3", 00:15:10.456 "uuid": "93df7991-047d-567f-a41c-74447c6e29ad", 00:15:10.456 "is_configured": true, 00:15:10.456 "data_offset": 2048, 00:15:10.456 "data_size": 
63488 00:15:10.456 } 00:15:10.456 ] 00:15:10.456 }' 00:15:10.457 19:51:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.457 19:51:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.393 19:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:11.393 [2024-07-24 19:51:02.926343] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:11.393 [2024-07-24 19:51:02.926383] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:11.393 [2024-07-24 19:51:02.929558] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:11.393 [2024-07-24 19:51:02.929596] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:11.393 [2024-07-24 19:51:02.929631] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:11.393 [2024-07-24 19:51:02.929643] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f6860 name raid_bdev1, state offline 00:15:11.393 0 00:15:11.393 19:51:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1406211 00:15:11.394 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1406211 ']' 00:15:11.394 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1406211 00:15:11.394 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:15:11.394 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:11.394 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1406211 00:15:11.653 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:15:11.653 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:11.653 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1406211' 00:15:11.653 killing process with pid 1406211 00:15:11.653 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1406211 00:15:11.653 [2024-07-24 19:51:03.000471] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:11.653 19:51:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1406211 00:15:11.653 [2024-07-24 19:51:03.022194] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.YxjH3HAAKs 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.40 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.40 != \0\.\0\0 ]] 00:15:11.914 00:15:11.914 real 0m7.112s 00:15:11.914 user 0m11.323s 00:15:11.914 sys 0m1.227s 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:11.914 19:51:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.914 ************************************ 00:15:11.914 END TEST raid_write_error_test 00:15:11.914 
************************************ 00:15:11.914 19:51:03 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:15:11.914 19:51:03 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:15:11.914 19:51:03 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:11.914 19:51:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:11.914 19:51:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:11.914 ************************************ 00:15:11.914 START TEST raid_state_function_test 00:15:11.914 ************************************ 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 false 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:11.914 19:51:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1407437 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1407437' 00:15:11.914 Process raid pid: 1407437 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1407437 /var/tmp/spdk-raid.sock 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1407437 ']' 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:11.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:11.914 19:51:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.914 [2024-07-24 19:51:03.466012] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:15:11.914 [2024-07-24 19:51:03.466148] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:12.174 [2024-07-24 19:51:03.654756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:12.174 [2024-07-24 19:51:03.758645] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:12.433 [2024-07-24 19:51:03.822926] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:12.433 [2024-07-24 19:51:03.822955] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:13.001 19:51:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:13.001 19:51:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:15:13.001 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:13.569 [2024-07-24 19:51:04.861901] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:13.569 [2024-07-24 19:51:04.861941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:13.569 [2024-07-24 19:51:04.861952] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:13.569 [2024-07-24 19:51:04.861964] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:13.569 [2024-07-24 19:51:04.861973] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:13.569 [2024-07-24 19:51:04.861983] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:13.569 
19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.569 19:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.569 19:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.569 "name": "Existed_Raid", 00:15:13.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.569 "strip_size_kb": 64, 00:15:13.569 "state": "configuring", 00:15:13.569 "raid_level": "concat", 00:15:13.569 "superblock": false, 00:15:13.569 "num_base_bdevs": 3, 00:15:13.569 "num_base_bdevs_discovered": 0, 00:15:13.569 "num_base_bdevs_operational": 3, 00:15:13.569 "base_bdevs_list": [ 00:15:13.569 { 
00:15:13.569 "name": "BaseBdev1", 00:15:13.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.569 "is_configured": false, 00:15:13.569 "data_offset": 0, 00:15:13.569 "data_size": 0 00:15:13.569 }, 00:15:13.569 { 00:15:13.569 "name": "BaseBdev2", 00:15:13.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.569 "is_configured": false, 00:15:13.569 "data_offset": 0, 00:15:13.570 "data_size": 0 00:15:13.570 }, 00:15:13.570 { 00:15:13.570 "name": "BaseBdev3", 00:15:13.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.570 "is_configured": false, 00:15:13.570 "data_offset": 0, 00:15:13.570 "data_size": 0 00:15:13.570 } 00:15:13.570 ] 00:15:13.570 }' 00:15:13.570 19:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.570 19:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:14.507 19:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:14.766 [2024-07-24 19:51:06.229367] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:14.766 [2024-07-24 19:51:06.229403] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187ba10 name Existed_Raid, state configuring 00:15:14.766 19:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:15.025 [2024-07-24 19:51:06.478043] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:15.025 [2024-07-24 19:51:06.478074] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:15.025 [2024-07-24 19:51:06.478084] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:15:15.025 [2024-07-24 19:51:06.478095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:15.025 [2024-07-24 19:51:06.478104] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:15.025 [2024-07-24 19:51:06.478115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:15.025 19:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:15.283 [2024-07-24 19:51:06.736608] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:15.283 BaseBdev1 00:15:15.283 19:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:15.284 19:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:15.284 19:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:15.284 19:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:15.284 19:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:15.284 19:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:15.284 19:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.542 19:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:15.801 [ 00:15:15.801 { 00:15:15.801 "name": "BaseBdev1", 00:15:15.801 "aliases": [ 00:15:15.801 
"c74d15b7-5a3a-4b87-b54f-73dabe998e97" 00:15:15.801 ], 00:15:15.801 "product_name": "Malloc disk", 00:15:15.801 "block_size": 512, 00:15:15.801 "num_blocks": 65536, 00:15:15.801 "uuid": "c74d15b7-5a3a-4b87-b54f-73dabe998e97", 00:15:15.801 "assigned_rate_limits": { 00:15:15.801 "rw_ios_per_sec": 0, 00:15:15.801 "rw_mbytes_per_sec": 0, 00:15:15.801 "r_mbytes_per_sec": 0, 00:15:15.801 "w_mbytes_per_sec": 0 00:15:15.801 }, 00:15:15.801 "claimed": true, 00:15:15.801 "claim_type": "exclusive_write", 00:15:15.801 "zoned": false, 00:15:15.801 "supported_io_types": { 00:15:15.801 "read": true, 00:15:15.801 "write": true, 00:15:15.801 "unmap": true, 00:15:15.801 "flush": true, 00:15:15.801 "reset": true, 00:15:15.801 "nvme_admin": false, 00:15:15.801 "nvme_io": false, 00:15:15.801 "nvme_io_md": false, 00:15:15.801 "write_zeroes": true, 00:15:15.801 "zcopy": true, 00:15:15.801 "get_zone_info": false, 00:15:15.801 "zone_management": false, 00:15:15.801 "zone_append": false, 00:15:15.801 "compare": false, 00:15:15.801 "compare_and_write": false, 00:15:15.801 "abort": true, 00:15:15.801 "seek_hole": false, 00:15:15.801 "seek_data": false, 00:15:15.801 "copy": true, 00:15:15.801 "nvme_iov_md": false 00:15:15.801 }, 00:15:15.801 "memory_domains": [ 00:15:15.801 { 00:15:15.801 "dma_device_id": "system", 00:15:15.801 "dma_device_type": 1 00:15:15.801 }, 00:15:15.801 { 00:15:15.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.801 "dma_device_type": 2 00:15:15.801 } 00:15:15.801 ], 00:15:15.801 "driver_specific": {} 00:15:15.801 } 00:15:15.801 ] 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.801 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.060 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.060 "name": "Existed_Raid", 00:15:16.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.060 "strip_size_kb": 64, 00:15:16.060 "state": "configuring", 00:15:16.060 "raid_level": "concat", 00:15:16.060 "superblock": false, 00:15:16.060 "num_base_bdevs": 3, 00:15:16.060 "num_base_bdevs_discovered": 1, 00:15:16.060 "num_base_bdevs_operational": 3, 00:15:16.060 "base_bdevs_list": [ 00:15:16.060 { 00:15:16.060 "name": "BaseBdev1", 00:15:16.060 "uuid": "c74d15b7-5a3a-4b87-b54f-73dabe998e97", 00:15:16.060 "is_configured": true, 00:15:16.060 "data_offset": 0, 00:15:16.060 "data_size": 65536 00:15:16.060 }, 00:15:16.060 { 00:15:16.060 "name": "BaseBdev2", 00:15:16.060 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:16.060 "is_configured": false, 00:15:16.060 "data_offset": 0, 00:15:16.060 "data_size": 0 00:15:16.060 }, 00:15:16.060 { 00:15:16.060 "name": "BaseBdev3", 00:15:16.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.060 "is_configured": false, 00:15:16.060 "data_offset": 0, 00:15:16.060 "data_size": 0 00:15:16.060 } 00:15:16.060 ] 00:15:16.060 }' 00:15:16.060 19:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.060 19:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.627 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:16.886 [2024-07-24 19:51:08.300727] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:16.886 [2024-07-24 19:51:08.300765] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187b2e0 name Existed_Raid, state configuring 00:15:16.886 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:17.144 [2024-07-24 19:51:08.549427] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:17.144 [2024-07-24 19:51:08.550893] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:17.144 [2024-07-24 19:51:08.550923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:17.144 [2024-07-24 19:51:08.550933] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:17.144 [2024-07-24 19:51:08.550944] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.144 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.401 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.401 "name": "Existed_Raid", 00:15:17.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.401 "strip_size_kb": 64, 00:15:17.401 "state": "configuring", 00:15:17.401 
"raid_level": "concat", 00:15:17.401 "superblock": false, 00:15:17.401 "num_base_bdevs": 3, 00:15:17.401 "num_base_bdevs_discovered": 1, 00:15:17.401 "num_base_bdevs_operational": 3, 00:15:17.401 "base_bdevs_list": [ 00:15:17.401 { 00:15:17.401 "name": "BaseBdev1", 00:15:17.401 "uuid": "c74d15b7-5a3a-4b87-b54f-73dabe998e97", 00:15:17.401 "is_configured": true, 00:15:17.401 "data_offset": 0, 00:15:17.401 "data_size": 65536 00:15:17.401 }, 00:15:17.401 { 00:15:17.401 "name": "BaseBdev2", 00:15:17.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.401 "is_configured": false, 00:15:17.401 "data_offset": 0, 00:15:17.401 "data_size": 0 00:15:17.401 }, 00:15:17.401 { 00:15:17.401 "name": "BaseBdev3", 00:15:17.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.401 "is_configured": false, 00:15:17.401 "data_offset": 0, 00:15:17.401 "data_size": 0 00:15:17.401 } 00:15:17.401 ] 00:15:17.401 }' 00:15:17.401 19:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.401 19:51:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.968 19:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:18.226 [2024-07-24 19:51:09.643620] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:18.226 BaseBdev2 00:15:18.226 19:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:18.226 19:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:18.226 19:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:18.226 19:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:18.226 19:51:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:18.226 19:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:18.226 19:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.484 19:51:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:18.742 [ 00:15:18.742 { 00:15:18.742 "name": "BaseBdev2", 00:15:18.742 "aliases": [ 00:15:18.742 "3a6d283b-b482-40cc-b7bb-06e9917eb153" 00:15:18.742 ], 00:15:18.742 "product_name": "Malloc disk", 00:15:18.742 "block_size": 512, 00:15:18.742 "num_blocks": 65536, 00:15:18.742 "uuid": "3a6d283b-b482-40cc-b7bb-06e9917eb153", 00:15:18.742 "assigned_rate_limits": { 00:15:18.742 "rw_ios_per_sec": 0, 00:15:18.742 "rw_mbytes_per_sec": 0, 00:15:18.742 "r_mbytes_per_sec": 0, 00:15:18.742 "w_mbytes_per_sec": 0 00:15:18.742 }, 00:15:18.742 "claimed": true, 00:15:18.742 "claim_type": "exclusive_write", 00:15:18.742 "zoned": false, 00:15:18.742 "supported_io_types": { 00:15:18.742 "read": true, 00:15:18.742 "write": true, 00:15:18.742 "unmap": true, 00:15:18.742 "flush": true, 00:15:18.742 "reset": true, 00:15:18.742 "nvme_admin": false, 00:15:18.742 "nvme_io": false, 00:15:18.742 "nvme_io_md": false, 00:15:18.742 "write_zeroes": true, 00:15:18.742 "zcopy": true, 00:15:18.742 "get_zone_info": false, 00:15:18.742 "zone_management": false, 00:15:18.742 "zone_append": false, 00:15:18.742 "compare": false, 00:15:18.742 "compare_and_write": false, 00:15:18.742 "abort": true, 00:15:18.742 "seek_hole": false, 00:15:18.742 "seek_data": false, 00:15:18.742 "copy": true, 00:15:18.742 "nvme_iov_md": false 00:15:18.742 }, 00:15:18.742 "memory_domains": [ 00:15:18.742 { 00:15:18.742 "dma_device_id": "system", 
00:15:18.742 "dma_device_type": 1 00:15:18.742 }, 00:15:18.742 { 00:15:18.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.742 "dma_device_type": 2 00:15:18.742 } 00:15:18.742 ], 00:15:18.742 "driver_specific": {} 00:15:18.742 } 00:15:18.742 ] 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.742 19:51:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.008 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.008 "name": "Existed_Raid", 00:15:19.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.008 "strip_size_kb": 64, 00:15:19.008 "state": "configuring", 00:15:19.008 "raid_level": "concat", 00:15:19.008 "superblock": false, 00:15:19.008 "num_base_bdevs": 3, 00:15:19.008 "num_base_bdevs_discovered": 2, 00:15:19.008 "num_base_bdevs_operational": 3, 00:15:19.008 "base_bdevs_list": [ 00:15:19.008 { 00:15:19.008 "name": "BaseBdev1", 00:15:19.008 "uuid": "c74d15b7-5a3a-4b87-b54f-73dabe998e97", 00:15:19.008 "is_configured": true, 00:15:19.008 "data_offset": 0, 00:15:19.008 "data_size": 65536 00:15:19.009 }, 00:15:19.009 { 00:15:19.009 "name": "BaseBdev2", 00:15:19.009 "uuid": "3a6d283b-b482-40cc-b7bb-06e9917eb153", 00:15:19.009 "is_configured": true, 00:15:19.009 "data_offset": 0, 00:15:19.009 "data_size": 65536 00:15:19.009 }, 00:15:19.009 { 00:15:19.009 "name": "BaseBdev3", 00:15:19.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.009 "is_configured": false, 00:15:19.009 "data_offset": 0, 00:15:19.009 "data_size": 0 00:15:19.009 } 00:15:19.009 ] 00:15:19.009 }' 00:15:19.009 19:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.009 19:51:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:19.962 [2024-07-24 19:51:11.531989] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:19.962 [2024-07-24 19:51:11.532033] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x187c1d0 00:15:19.962 [2024-07-24 19:51:11.532041] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:19.962 [2024-07-24 19:51:11.532232] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a23370 00:15:19.962 [2024-07-24 19:51:11.532356] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x187c1d0 00:15:19.962 [2024-07-24 19:51:11.532366] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x187c1d0 00:15:19.962 [2024-07-24 19:51:11.532535] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:19.962 BaseBdev3 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:19.962 19:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.221 19:51:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:20.478 [ 00:15:20.478 { 00:15:20.478 "name": "BaseBdev3", 00:15:20.478 "aliases": [ 00:15:20.478 "1267fd9b-a298-43d8-8f7b-0c23ff2f763c" 00:15:20.478 ], 00:15:20.478 "product_name": "Malloc disk", 00:15:20.478 "block_size": 512, 00:15:20.478 "num_blocks": 65536, 00:15:20.478 
"uuid": "1267fd9b-a298-43d8-8f7b-0c23ff2f763c", 00:15:20.478 "assigned_rate_limits": { 00:15:20.478 "rw_ios_per_sec": 0, 00:15:20.478 "rw_mbytes_per_sec": 0, 00:15:20.478 "r_mbytes_per_sec": 0, 00:15:20.478 "w_mbytes_per_sec": 0 00:15:20.478 }, 00:15:20.478 "claimed": true, 00:15:20.478 "claim_type": "exclusive_write", 00:15:20.478 "zoned": false, 00:15:20.478 "supported_io_types": { 00:15:20.478 "read": true, 00:15:20.478 "write": true, 00:15:20.479 "unmap": true, 00:15:20.479 "flush": true, 00:15:20.479 "reset": true, 00:15:20.479 "nvme_admin": false, 00:15:20.479 "nvme_io": false, 00:15:20.479 "nvme_io_md": false, 00:15:20.479 "write_zeroes": true, 00:15:20.479 "zcopy": true, 00:15:20.479 "get_zone_info": false, 00:15:20.479 "zone_management": false, 00:15:20.479 "zone_append": false, 00:15:20.479 "compare": false, 00:15:20.479 "compare_and_write": false, 00:15:20.479 "abort": true, 00:15:20.479 "seek_hole": false, 00:15:20.479 "seek_data": false, 00:15:20.479 "copy": true, 00:15:20.479 "nvme_iov_md": false 00:15:20.479 }, 00:15:20.479 "memory_domains": [ 00:15:20.479 { 00:15:20.479 "dma_device_id": "system", 00:15:20.479 "dma_device_type": 1 00:15:20.479 }, 00:15:20.479 { 00:15:20.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.479 "dma_device_type": 2 00:15:20.479 } 00:15:20.479 ], 00:15:20.479 "driver_specific": {} 00:15:20.479 } 00:15:20.479 ] 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.479 19:51:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.479 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.737 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.737 "name": "Existed_Raid", 00:15:20.737 "uuid": "67205027-5b19-4e2b-aae3-cc2148b50c4a", 00:15:20.737 "strip_size_kb": 64, 00:15:20.737 "state": "online", 00:15:20.737 "raid_level": "concat", 00:15:20.737 "superblock": false, 00:15:20.737 "num_base_bdevs": 3, 00:15:20.737 "num_base_bdevs_discovered": 3, 00:15:20.737 "num_base_bdevs_operational": 3, 00:15:20.737 "base_bdevs_list": [ 00:15:20.737 { 00:15:20.737 "name": "BaseBdev1", 00:15:20.737 "uuid": "c74d15b7-5a3a-4b87-b54f-73dabe998e97", 00:15:20.737 "is_configured": true, 00:15:20.737 "data_offset": 0, 00:15:20.737 "data_size": 65536 00:15:20.737 }, 00:15:20.737 { 00:15:20.737 "name": "BaseBdev2", 00:15:20.737 "uuid": 
"3a6d283b-b482-40cc-b7bb-06e9917eb153", 00:15:20.737 "is_configured": true, 00:15:20.737 "data_offset": 0, 00:15:20.737 "data_size": 65536 00:15:20.737 }, 00:15:20.737 { 00:15:20.737 "name": "BaseBdev3", 00:15:20.737 "uuid": "1267fd9b-a298-43d8-8f7b-0c23ff2f763c", 00:15:20.737 "is_configured": true, 00:15:20.737 "data_offset": 0, 00:15:20.737 "data_size": 65536 00:15:20.737 } 00:15:20.737 ] 00:15:20.737 }' 00:15:20.737 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.737 19:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:21.304 19:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:21.871 [2024-07-24 19:51:13.349165] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:21.871 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:21.871 "name": "Existed_Raid", 00:15:21.871 "aliases": [ 00:15:21.871 "67205027-5b19-4e2b-aae3-cc2148b50c4a" 00:15:21.871 ], 00:15:21.871 "product_name": "Raid Volume", 
00:15:21.871 "block_size": 512, 00:15:21.871 "num_blocks": 196608, 00:15:21.871 "uuid": "67205027-5b19-4e2b-aae3-cc2148b50c4a", 00:15:21.871 "assigned_rate_limits": { 00:15:21.871 "rw_ios_per_sec": 0, 00:15:21.871 "rw_mbytes_per_sec": 0, 00:15:21.871 "r_mbytes_per_sec": 0, 00:15:21.871 "w_mbytes_per_sec": 0 00:15:21.871 }, 00:15:21.871 "claimed": false, 00:15:21.871 "zoned": false, 00:15:21.871 "supported_io_types": { 00:15:21.871 "read": true, 00:15:21.871 "write": true, 00:15:21.871 "unmap": true, 00:15:21.871 "flush": true, 00:15:21.871 "reset": true, 00:15:21.871 "nvme_admin": false, 00:15:21.871 "nvme_io": false, 00:15:21.871 "nvme_io_md": false, 00:15:21.871 "write_zeroes": true, 00:15:21.871 "zcopy": false, 00:15:21.871 "get_zone_info": false, 00:15:21.871 "zone_management": false, 00:15:21.871 "zone_append": false, 00:15:21.871 "compare": false, 00:15:21.871 "compare_and_write": false, 00:15:21.871 "abort": false, 00:15:21.871 "seek_hole": false, 00:15:21.871 "seek_data": false, 00:15:21.871 "copy": false, 00:15:21.871 "nvme_iov_md": false 00:15:21.871 }, 00:15:21.871 "memory_domains": [ 00:15:21.871 { 00:15:21.871 "dma_device_id": "system", 00:15:21.871 "dma_device_type": 1 00:15:21.871 }, 00:15:21.871 { 00:15:21.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.871 "dma_device_type": 2 00:15:21.871 }, 00:15:21.871 { 00:15:21.871 "dma_device_id": "system", 00:15:21.871 "dma_device_type": 1 00:15:21.871 }, 00:15:21.871 { 00:15:21.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.871 "dma_device_type": 2 00:15:21.871 }, 00:15:21.871 { 00:15:21.871 "dma_device_id": "system", 00:15:21.871 "dma_device_type": 1 00:15:21.871 }, 00:15:21.871 { 00:15:21.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.871 "dma_device_type": 2 00:15:21.871 } 00:15:21.871 ], 00:15:21.871 "driver_specific": { 00:15:21.871 "raid": { 00:15:21.871 "uuid": "67205027-5b19-4e2b-aae3-cc2148b50c4a", 00:15:21.871 "strip_size_kb": 64, 00:15:21.871 "state": "online", 00:15:21.871 
"raid_level": "concat", 00:15:21.871 "superblock": false, 00:15:21.871 "num_base_bdevs": 3, 00:15:21.871 "num_base_bdevs_discovered": 3, 00:15:21.871 "num_base_bdevs_operational": 3, 00:15:21.871 "base_bdevs_list": [ 00:15:21.871 { 00:15:21.871 "name": "BaseBdev1", 00:15:21.871 "uuid": "c74d15b7-5a3a-4b87-b54f-73dabe998e97", 00:15:21.871 "is_configured": true, 00:15:21.871 "data_offset": 0, 00:15:21.871 "data_size": 65536 00:15:21.871 }, 00:15:21.871 { 00:15:21.871 "name": "BaseBdev2", 00:15:21.871 "uuid": "3a6d283b-b482-40cc-b7bb-06e9917eb153", 00:15:21.871 "is_configured": true, 00:15:21.871 "data_offset": 0, 00:15:21.871 "data_size": 65536 00:15:21.871 }, 00:15:21.871 { 00:15:21.871 "name": "BaseBdev3", 00:15:21.871 "uuid": "1267fd9b-a298-43d8-8f7b-0c23ff2f763c", 00:15:21.871 "is_configured": true, 00:15:21.871 "data_offset": 0, 00:15:21.871 "data_size": 65536 00:15:21.871 } 00:15:21.871 ] 00:15:21.871 } 00:15:21.871 } 00:15:21.871 }' 00:15:21.871 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:21.871 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:21.871 BaseBdev2 00:15:21.871 BaseBdev3' 00:15:21.871 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:21.871 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:21.871 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:22.129 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:22.129 "name": "BaseBdev1", 00:15:22.129 "aliases": [ 00:15:22.129 "c74d15b7-5a3a-4b87-b54f-73dabe998e97" 00:15:22.129 ], 00:15:22.129 "product_name": "Malloc disk", 00:15:22.129 
"block_size": 512, 00:15:22.129 "num_blocks": 65536, 00:15:22.129 "uuid": "c74d15b7-5a3a-4b87-b54f-73dabe998e97", 00:15:22.129 "assigned_rate_limits": { 00:15:22.129 "rw_ios_per_sec": 0, 00:15:22.129 "rw_mbytes_per_sec": 0, 00:15:22.129 "r_mbytes_per_sec": 0, 00:15:22.129 "w_mbytes_per_sec": 0 00:15:22.129 }, 00:15:22.129 "claimed": true, 00:15:22.129 "claim_type": "exclusive_write", 00:15:22.129 "zoned": false, 00:15:22.129 "supported_io_types": { 00:15:22.129 "read": true, 00:15:22.129 "write": true, 00:15:22.129 "unmap": true, 00:15:22.129 "flush": true, 00:15:22.129 "reset": true, 00:15:22.129 "nvme_admin": false, 00:15:22.129 "nvme_io": false, 00:15:22.129 "nvme_io_md": false, 00:15:22.129 "write_zeroes": true, 00:15:22.129 "zcopy": true, 00:15:22.129 "get_zone_info": false, 00:15:22.129 "zone_management": false, 00:15:22.129 "zone_append": false, 00:15:22.129 "compare": false, 00:15:22.129 "compare_and_write": false, 00:15:22.129 "abort": true, 00:15:22.129 "seek_hole": false, 00:15:22.129 "seek_data": false, 00:15:22.129 "copy": true, 00:15:22.129 "nvme_iov_md": false 00:15:22.129 }, 00:15:22.129 "memory_domains": [ 00:15:22.129 { 00:15:22.129 "dma_device_id": "system", 00:15:22.129 "dma_device_type": 1 00:15:22.129 }, 00:15:22.129 { 00:15:22.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.129 "dma_device_type": 2 00:15:22.129 } 00:15:22.129 ], 00:15:22.129 "driver_specific": {} 00:15:22.129 }' 00:15:22.129 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.129 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.387 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.645 19:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.645 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.645 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:22.645 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:22.645 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:22.905 "name": "BaseBdev2", 00:15:22.905 "aliases": [ 00:15:22.905 "3a6d283b-b482-40cc-b7bb-06e9917eb153" 00:15:22.905 ], 00:15:22.905 "product_name": "Malloc disk", 00:15:22.905 "block_size": 512, 00:15:22.905 "num_blocks": 65536, 00:15:22.905 "uuid": "3a6d283b-b482-40cc-b7bb-06e9917eb153", 00:15:22.905 "assigned_rate_limits": { 00:15:22.905 "rw_ios_per_sec": 0, 00:15:22.905 "rw_mbytes_per_sec": 0, 00:15:22.905 "r_mbytes_per_sec": 0, 00:15:22.905 "w_mbytes_per_sec": 0 00:15:22.905 }, 00:15:22.905 "claimed": true, 00:15:22.905 "claim_type": "exclusive_write", 00:15:22.905 "zoned": false, 00:15:22.905 "supported_io_types": { 00:15:22.905 "read": true, 00:15:22.905 "write": true, 00:15:22.905 "unmap": true, 00:15:22.905 "flush": true, 00:15:22.905 "reset": true, 00:15:22.905 "nvme_admin": 
false, 00:15:22.905 "nvme_io": false, 00:15:22.905 "nvme_io_md": false, 00:15:22.905 "write_zeroes": true, 00:15:22.905 "zcopy": true, 00:15:22.905 "get_zone_info": false, 00:15:22.905 "zone_management": false, 00:15:22.905 "zone_append": false, 00:15:22.905 "compare": false, 00:15:22.905 "compare_and_write": false, 00:15:22.905 "abort": true, 00:15:22.905 "seek_hole": false, 00:15:22.905 "seek_data": false, 00:15:22.905 "copy": true, 00:15:22.905 "nvme_iov_md": false 00:15:22.905 }, 00:15:22.905 "memory_domains": [ 00:15:22.905 { 00:15:22.905 "dma_device_id": "system", 00:15:22.905 "dma_device_type": 1 00:15:22.905 }, 00:15:22.905 { 00:15:22.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.905 "dma_device_type": 2 00:15:22.905 } 00:15:22.905 ], 00:15:22.905 "driver_specific": {} 00:15:22.905 }' 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.905 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.164 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.164 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.164 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.164 19:51:14 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:23.164 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:23.164 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:23.164 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:23.423 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:23.423 "name": "BaseBdev3", 00:15:23.423 "aliases": [ 00:15:23.423 "1267fd9b-a298-43d8-8f7b-0c23ff2f763c" 00:15:23.423 ], 00:15:23.423 "product_name": "Malloc disk", 00:15:23.423 "block_size": 512, 00:15:23.423 "num_blocks": 65536, 00:15:23.423 "uuid": "1267fd9b-a298-43d8-8f7b-0c23ff2f763c", 00:15:23.423 "assigned_rate_limits": { 00:15:23.423 "rw_ios_per_sec": 0, 00:15:23.423 "rw_mbytes_per_sec": 0, 00:15:23.423 "r_mbytes_per_sec": 0, 00:15:23.423 "w_mbytes_per_sec": 0 00:15:23.423 }, 00:15:23.423 "claimed": true, 00:15:23.423 "claim_type": "exclusive_write", 00:15:23.423 "zoned": false, 00:15:23.423 "supported_io_types": { 00:15:23.423 "read": true, 00:15:23.423 "write": true, 00:15:23.423 "unmap": true, 00:15:23.423 "flush": true, 00:15:23.423 "reset": true, 00:15:23.423 "nvme_admin": false, 00:15:23.423 "nvme_io": false, 00:15:23.423 "nvme_io_md": false, 00:15:23.423 "write_zeroes": true, 00:15:23.423 "zcopy": true, 00:15:23.423 "get_zone_info": false, 00:15:23.423 "zone_management": false, 00:15:23.423 "zone_append": false, 00:15:23.423 "compare": false, 00:15:23.423 "compare_and_write": false, 00:15:23.423 "abort": true, 00:15:23.423 "seek_hole": false, 00:15:23.423 "seek_data": false, 00:15:23.423 "copy": true, 00:15:23.423 "nvme_iov_md": false 00:15:23.423 }, 00:15:23.423 "memory_domains": [ 00:15:23.423 { 00:15:23.423 "dma_device_id": "system", 00:15:23.423 "dma_device_type": 1 00:15:23.423 
}, 00:15:23.423 { 00:15:23.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.423 "dma_device_type": 2 00:15:23.423 } 00:15:23.423 ], 00:15:23.423 "driver_specific": {} 00:15:23.423 }' 00:15:23.423 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:23.423 19:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.682 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.941 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:23.941 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:23.941 [2024-07-24 19:51:15.530650] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:23.941 [2024-07-24 19:51:15.530682] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:23.941 [2024-07-24 19:51:15.530727] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:24.201 
19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.201 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:24.460 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.461 "name": "Existed_Raid", 00:15:24.461 "uuid": "67205027-5b19-4e2b-aae3-cc2148b50c4a", 00:15:24.461 "strip_size_kb": 64, 00:15:24.461 "state": "offline", 00:15:24.461 "raid_level": "concat", 00:15:24.461 "superblock": false, 00:15:24.461 "num_base_bdevs": 3, 00:15:24.461 "num_base_bdevs_discovered": 2, 00:15:24.461 "num_base_bdevs_operational": 2, 00:15:24.461 "base_bdevs_list": [ 00:15:24.461 { 00:15:24.461 "name": null, 00:15:24.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.461 "is_configured": false, 00:15:24.461 "data_offset": 0, 00:15:24.461 "data_size": 65536 00:15:24.461 }, 00:15:24.461 { 00:15:24.461 "name": "BaseBdev2", 00:15:24.461 "uuid": "3a6d283b-b482-40cc-b7bb-06e9917eb153", 00:15:24.461 "is_configured": true, 00:15:24.461 "data_offset": 0, 00:15:24.461 "data_size": 65536 00:15:24.461 }, 00:15:24.461 { 00:15:24.461 "name": "BaseBdev3", 00:15:24.461 "uuid": "1267fd9b-a298-43d8-8f7b-0c23ff2f763c", 00:15:24.461 "is_configured": true, 00:15:24.461 "data_offset": 0, 00:15:24.461 "data_size": 65536 00:15:24.461 } 00:15:24.461 ] 00:15:24.461 }' 00:15:24.461 19:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.461 19:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.029 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:25.029 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:25.029 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:25.029 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.288 19:51:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:25.288 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:25.288 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:25.548 [2024-07-24 19:51:16.923965] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:25.548 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:25.548 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:25.548 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.548 19:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:25.806 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:25.806 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:25.806 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:26.064 [2024-07-24 19:51:17.449468] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:26.064 [2024-07-24 19:51:17.449522] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187c1d0 name Existed_Raid, state offline 00:15:26.064 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:26.064 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:26.064 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.064 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:26.324 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:26.324 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:26.324 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:26.324 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:26.324 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:26.324 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:26.583 BaseBdev2 00:15:26.583 19:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:26.583 19:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:26.583 19:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:26.583 19:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:26.583 19:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:26.583 19:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:26.583 19:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:26.842 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:27.101 [ 00:15:27.101 { 00:15:27.101 "name": "BaseBdev2", 00:15:27.101 "aliases": [ 00:15:27.101 "0a470a92-ec83-4475-8d1d-831b2fde0170" 00:15:27.101 ], 00:15:27.101 "product_name": "Malloc disk", 00:15:27.101 "block_size": 512, 00:15:27.101 "num_blocks": 65536, 00:15:27.101 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:27.101 "assigned_rate_limits": { 00:15:27.101 "rw_ios_per_sec": 0, 00:15:27.101 "rw_mbytes_per_sec": 0, 00:15:27.101 "r_mbytes_per_sec": 0, 00:15:27.101 "w_mbytes_per_sec": 0 00:15:27.101 }, 00:15:27.101 "claimed": false, 00:15:27.101 "zoned": false, 00:15:27.101 "supported_io_types": { 00:15:27.101 "read": true, 00:15:27.101 "write": true, 00:15:27.101 "unmap": true, 00:15:27.101 "flush": true, 00:15:27.101 "reset": true, 00:15:27.101 "nvme_admin": false, 00:15:27.101 "nvme_io": false, 00:15:27.101 "nvme_io_md": false, 00:15:27.101 "write_zeroes": true, 00:15:27.101 "zcopy": true, 00:15:27.101 "get_zone_info": false, 00:15:27.101 "zone_management": false, 00:15:27.101 "zone_append": false, 00:15:27.101 "compare": false, 00:15:27.101 "compare_and_write": false, 00:15:27.101 "abort": true, 00:15:27.101 "seek_hole": false, 00:15:27.101 "seek_data": false, 00:15:27.101 "copy": true, 00:15:27.101 "nvme_iov_md": false 00:15:27.101 }, 00:15:27.101 "memory_domains": [ 00:15:27.101 { 00:15:27.101 "dma_device_id": "system", 00:15:27.101 "dma_device_type": 1 00:15:27.101 }, 00:15:27.101 { 00:15:27.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.101 "dma_device_type": 2 00:15:27.101 } 00:15:27.101 ], 00:15:27.101 "driver_specific": {} 00:15:27.101 } 00:15:27.101 ] 00:15:27.101 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:27.102 19:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:27.102 19:51:18 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:27.102 19:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:27.361 BaseBdev3 00:15:27.361 19:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:27.361 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:27.361 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:27.361 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:27.361 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:27.361 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:27.361 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:27.621 19:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:27.880 [ 00:15:27.880 { 00:15:27.880 "name": "BaseBdev3", 00:15:27.880 "aliases": [ 00:15:27.880 "d7d3b8be-74a3-42ea-bda2-de16e9aa1146" 00:15:27.880 ], 00:15:27.880 "product_name": "Malloc disk", 00:15:27.880 "block_size": 512, 00:15:27.880 "num_blocks": 65536, 00:15:27.880 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:27.880 "assigned_rate_limits": { 00:15:27.880 "rw_ios_per_sec": 0, 00:15:27.880 "rw_mbytes_per_sec": 0, 00:15:27.880 "r_mbytes_per_sec": 0, 00:15:27.880 "w_mbytes_per_sec": 0 00:15:27.880 }, 00:15:27.880 "claimed": false, 00:15:27.880 "zoned": false, 00:15:27.880 
"supported_io_types": { 00:15:27.880 "read": true, 00:15:27.880 "write": true, 00:15:27.880 "unmap": true, 00:15:27.880 "flush": true, 00:15:27.880 "reset": true, 00:15:27.880 "nvme_admin": false, 00:15:27.880 "nvme_io": false, 00:15:27.880 "nvme_io_md": false, 00:15:27.880 "write_zeroes": true, 00:15:27.880 "zcopy": true, 00:15:27.880 "get_zone_info": false, 00:15:27.880 "zone_management": false, 00:15:27.880 "zone_append": false, 00:15:27.880 "compare": false, 00:15:27.880 "compare_and_write": false, 00:15:27.880 "abort": true, 00:15:27.880 "seek_hole": false, 00:15:27.880 "seek_data": false, 00:15:27.880 "copy": true, 00:15:27.880 "nvme_iov_md": false 00:15:27.880 }, 00:15:27.880 "memory_domains": [ 00:15:27.880 { 00:15:27.880 "dma_device_id": "system", 00:15:27.880 "dma_device_type": 1 00:15:27.881 }, 00:15:27.881 { 00:15:27.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.881 "dma_device_type": 2 00:15:27.881 } 00:15:27.881 ], 00:15:27.881 "driver_specific": {} 00:15:27.881 } 00:15:27.881 ] 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:27.881 [2024-07-24 19:51:19.446727] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:27.881 [2024-07-24 19:51:19.446778] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:27.881 [2024-07-24 19:51:19.446798] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:27.881 
[2024-07-24 19:51:19.448504] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.881 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.140 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.140 "name": "Existed_Raid", 00:15:28.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.140 "strip_size_kb": 64, 00:15:28.140 "state": "configuring", 00:15:28.140 "raid_level": "concat", 00:15:28.140 "superblock": false, 00:15:28.140 "num_base_bdevs": 3, 00:15:28.140 
"num_base_bdevs_discovered": 2, 00:15:28.140 "num_base_bdevs_operational": 3, 00:15:28.140 "base_bdevs_list": [ 00:15:28.140 { 00:15:28.140 "name": "BaseBdev1", 00:15:28.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.140 "is_configured": false, 00:15:28.140 "data_offset": 0, 00:15:28.140 "data_size": 0 00:15:28.140 }, 00:15:28.140 { 00:15:28.140 "name": "BaseBdev2", 00:15:28.140 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:28.140 "is_configured": true, 00:15:28.140 "data_offset": 0, 00:15:28.140 "data_size": 65536 00:15:28.140 }, 00:15:28.140 { 00:15:28.140 "name": "BaseBdev3", 00:15:28.140 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:28.140 "is_configured": true, 00:15:28.140 "data_offset": 0, 00:15:28.140 "data_size": 65536 00:15:28.140 } 00:15:28.140 ] 00:15:28.140 }' 00:15:28.140 19:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.140 19:51:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:29.079 [2024-07-24 19:51:20.529620] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.079 19:51:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.079 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.338 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.338 "name": "Existed_Raid", 00:15:29.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.338 "strip_size_kb": 64, 00:15:29.338 "state": "configuring", 00:15:29.338 "raid_level": "concat", 00:15:29.338 "superblock": false, 00:15:29.338 "num_base_bdevs": 3, 00:15:29.338 "num_base_bdevs_discovered": 1, 00:15:29.338 "num_base_bdevs_operational": 3, 00:15:29.338 "base_bdevs_list": [ 00:15:29.338 { 00:15:29.338 "name": "BaseBdev1", 00:15:29.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.338 "is_configured": false, 00:15:29.338 "data_offset": 0, 00:15:29.338 "data_size": 0 00:15:29.338 }, 00:15:29.338 { 00:15:29.338 "name": null, 00:15:29.338 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:29.338 "is_configured": false, 00:15:29.338 "data_offset": 0, 00:15:29.338 "data_size": 65536 00:15:29.338 }, 00:15:29.338 { 00:15:29.338 "name": "BaseBdev3", 00:15:29.338 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:29.338 "is_configured": true, 00:15:29.338 "data_offset": 0, 
00:15:29.338 "data_size": 65536 00:15:29.338 } 00:15:29.338 ] 00:15:29.338 }' 00:15:29.338 19:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.338 19:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.914 19:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.914 19:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:30.174 19:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:30.174 19:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:30.434 [2024-07-24 19:51:21.932825] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:30.434 BaseBdev1 00:15:30.434 19:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:30.434 19:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:30.434 19:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:30.434 19:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:30.434 19:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:30.434 19:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:30.434 19:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.693 19:51:22 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:30.952 [ 00:15:30.952 { 00:15:30.952 "name": "BaseBdev1", 00:15:30.952 "aliases": [ 00:15:30.952 "39790942-ca66-4488-895f-e99cc1b413ef" 00:15:30.952 ], 00:15:30.952 "product_name": "Malloc disk", 00:15:30.952 "block_size": 512, 00:15:30.952 "num_blocks": 65536, 00:15:30.952 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:30.952 "assigned_rate_limits": { 00:15:30.952 "rw_ios_per_sec": 0, 00:15:30.952 "rw_mbytes_per_sec": 0, 00:15:30.952 "r_mbytes_per_sec": 0, 00:15:30.952 "w_mbytes_per_sec": 0 00:15:30.952 }, 00:15:30.952 "claimed": true, 00:15:30.952 "claim_type": "exclusive_write", 00:15:30.952 "zoned": false, 00:15:30.952 "supported_io_types": { 00:15:30.952 "read": true, 00:15:30.952 "write": true, 00:15:30.952 "unmap": true, 00:15:30.952 "flush": true, 00:15:30.952 "reset": true, 00:15:30.952 "nvme_admin": false, 00:15:30.952 "nvme_io": false, 00:15:30.952 "nvme_io_md": false, 00:15:30.952 "write_zeroes": true, 00:15:30.952 "zcopy": true, 00:15:30.952 "get_zone_info": false, 00:15:30.952 "zone_management": false, 00:15:30.952 "zone_append": false, 00:15:30.952 "compare": false, 00:15:30.952 "compare_and_write": false, 00:15:30.952 "abort": true, 00:15:30.952 "seek_hole": false, 00:15:30.952 "seek_data": false, 00:15:30.952 "copy": true, 00:15:30.952 "nvme_iov_md": false 00:15:30.952 }, 00:15:30.952 "memory_domains": [ 00:15:30.952 { 00:15:30.952 "dma_device_id": "system", 00:15:30.952 "dma_device_type": 1 00:15:30.952 }, 00:15:30.952 { 00:15:30.952 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.952 "dma_device_type": 2 00:15:30.952 } 00:15:30.952 ], 00:15:30.952 "driver_specific": {} 00:15:30.952 } 00:15:30.952 ] 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:30.952 19:51:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.952 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.212 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.212 "name": "Existed_Raid", 00:15:31.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.212 "strip_size_kb": 64, 00:15:31.212 "state": "configuring", 00:15:31.212 "raid_level": "concat", 00:15:31.212 "superblock": false, 00:15:31.212 "num_base_bdevs": 3, 00:15:31.212 "num_base_bdevs_discovered": 2, 00:15:31.212 "num_base_bdevs_operational": 3, 00:15:31.212 "base_bdevs_list": [ 00:15:31.212 { 
00:15:31.212 "name": "BaseBdev1", 00:15:31.212 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:31.212 "is_configured": true, 00:15:31.212 "data_offset": 0, 00:15:31.212 "data_size": 65536 00:15:31.212 }, 00:15:31.212 { 00:15:31.212 "name": null, 00:15:31.212 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:31.212 "is_configured": false, 00:15:31.212 "data_offset": 0, 00:15:31.212 "data_size": 65536 00:15:31.212 }, 00:15:31.212 { 00:15:31.212 "name": "BaseBdev3", 00:15:31.212 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:31.212 "is_configured": true, 00:15:31.212 "data_offset": 0, 00:15:31.212 "data_size": 65536 00:15:31.212 } 00:15:31.212 ] 00:15:31.212 }' 00:15:31.212 19:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.212 19:51:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.778 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:31.778 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.036 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:32.036 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:32.295 [2024-07-24 19:51:23.773768] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.295 19:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.554 19:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.554 "name": "Existed_Raid", 00:15:32.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.554 "strip_size_kb": 64, 00:15:32.554 "state": "configuring", 00:15:32.554 "raid_level": "concat", 00:15:32.554 "superblock": false, 00:15:32.554 "num_base_bdevs": 3, 00:15:32.554 "num_base_bdevs_discovered": 1, 00:15:32.554 "num_base_bdevs_operational": 3, 00:15:32.554 "base_bdevs_list": [ 00:15:32.554 { 00:15:32.554 "name": "BaseBdev1", 00:15:32.554 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:32.554 "is_configured": true, 00:15:32.554 "data_offset": 0, 00:15:32.554 "data_size": 65536 00:15:32.554 }, 00:15:32.554 { 00:15:32.554 "name": null, 00:15:32.554 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:32.554 
"is_configured": false, 00:15:32.554 "data_offset": 0, 00:15:32.554 "data_size": 65536 00:15:32.554 }, 00:15:32.554 { 00:15:32.554 "name": null, 00:15:32.554 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:32.554 "is_configured": false, 00:15:32.554 "data_offset": 0, 00:15:32.554 "data_size": 65536 00:15:32.554 } 00:15:32.554 ] 00:15:32.554 }' 00:15:32.554 19:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.554 19:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.126 19:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.126 19:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:33.385 19:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:33.385 19:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:33.644 [2024-07-24 19:51:25.125383] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:33.644 19:51:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.644 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.904 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.904 "name": "Existed_Raid", 00:15:33.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.904 "strip_size_kb": 64, 00:15:33.904 "state": "configuring", 00:15:33.904 "raid_level": "concat", 00:15:33.904 "superblock": false, 00:15:33.904 "num_base_bdevs": 3, 00:15:33.904 "num_base_bdevs_discovered": 2, 00:15:33.904 "num_base_bdevs_operational": 3, 00:15:33.904 "base_bdevs_list": [ 00:15:33.904 { 00:15:33.904 "name": "BaseBdev1", 00:15:33.904 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:33.904 "is_configured": true, 00:15:33.904 "data_offset": 0, 00:15:33.904 "data_size": 65536 00:15:33.904 }, 00:15:33.904 { 00:15:33.904 "name": null, 00:15:33.904 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:33.904 "is_configured": false, 00:15:33.904 "data_offset": 0, 00:15:33.904 "data_size": 65536 00:15:33.904 }, 00:15:33.904 { 00:15:33.904 "name": "BaseBdev3", 00:15:33.904 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:33.904 "is_configured": true, 00:15:33.904 "data_offset": 0, 
00:15:33.904 "data_size": 65536 00:15:33.904 } 00:15:33.904 ] 00:15:33.904 }' 00:15:33.904 19:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.904 19:51:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.473 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.473 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:34.732 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:34.732 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:34.992 [2024-07-24 19:51:26.517110] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.992 
19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.992 19:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.591 19:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.591 "name": "Existed_Raid", 00:15:35.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.591 "strip_size_kb": 64, 00:15:35.591 "state": "configuring", 00:15:35.591 "raid_level": "concat", 00:15:35.591 "superblock": false, 00:15:35.591 "num_base_bdevs": 3, 00:15:35.591 "num_base_bdevs_discovered": 1, 00:15:35.591 "num_base_bdevs_operational": 3, 00:15:35.591 "base_bdevs_list": [ 00:15:35.591 { 00:15:35.591 "name": null, 00:15:35.591 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:35.591 "is_configured": false, 00:15:35.591 "data_offset": 0, 00:15:35.591 "data_size": 65536 00:15:35.591 }, 00:15:35.591 { 00:15:35.591 "name": null, 00:15:35.591 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:35.591 "is_configured": false, 00:15:35.591 "data_offset": 0, 00:15:35.591 "data_size": 65536 00:15:35.591 }, 00:15:35.591 { 00:15:35.591 "name": "BaseBdev3", 00:15:35.591 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:35.591 "is_configured": true, 00:15:35.591 "data_offset": 0, 00:15:35.591 "data_size": 65536 00:15:35.591 } 00:15:35.591 ] 00:15:35.591 }' 00:15:35.591 19:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.591 19:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.159 19:51:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.159 19:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:36.418 19:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:36.418 19:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:36.677 [2024-07-24 19:51:28.143383] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.677 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.937 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.937 "name": "Existed_Raid", 00:15:36.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.937 "strip_size_kb": 64, 00:15:36.937 "state": "configuring", 00:15:36.937 "raid_level": "concat", 00:15:36.937 "superblock": false, 00:15:36.937 "num_base_bdevs": 3, 00:15:36.937 "num_base_bdevs_discovered": 2, 00:15:36.937 "num_base_bdevs_operational": 3, 00:15:36.937 "base_bdevs_list": [ 00:15:36.937 { 00:15:36.937 "name": null, 00:15:36.937 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:36.937 "is_configured": false, 00:15:36.937 "data_offset": 0, 00:15:36.937 "data_size": 65536 00:15:36.937 }, 00:15:36.937 { 00:15:36.937 "name": "BaseBdev2", 00:15:36.937 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:36.937 "is_configured": true, 00:15:36.937 "data_offset": 0, 00:15:36.937 "data_size": 65536 00:15:36.937 }, 00:15:36.937 { 00:15:36.937 "name": "BaseBdev3", 00:15:36.937 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:36.937 "is_configured": true, 00:15:36.937 "data_offset": 0, 00:15:36.937 "data_size": 65536 00:15:36.937 } 00:15:36.937 ] 00:15:36.937 }' 00:15:36.937 19:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.937 19:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.505 19:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.505 19:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:37.764 
19:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:37.764 19:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.764 19:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:38.024 19:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 39790942-ca66-4488-895f-e99cc1b413ef 00:15:38.283 [2024-07-24 19:51:29.756637] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:38.283 [2024-07-24 19:51:29.756686] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a234f0 00:15:38.283 [2024-07-24 19:51:29.756695] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:38.283 [2024-07-24 19:51:29.756924] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18930c0 00:15:38.283 [2024-07-24 19:51:29.757058] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a234f0 00:15:38.283 [2024-07-24 19:51:29.757068] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a234f0 00:15:38.283 [2024-07-24 19:51:29.757246] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:38.283 NewBaseBdev 00:15:38.283 19:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:38.283 19:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:15:38.283 19:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:38.283 19:51:29 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # local i 00:15:38.283 19:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:38.283 19:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:38.283 19:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:38.543 19:51:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:38.543 [ 00:15:38.543 { 00:15:38.543 "name": "NewBaseBdev", 00:15:38.543 "aliases": [ 00:15:38.543 "39790942-ca66-4488-895f-e99cc1b413ef" 00:15:38.543 ], 00:15:38.543 "product_name": "Malloc disk", 00:15:38.543 "block_size": 512, 00:15:38.543 "num_blocks": 65536, 00:15:38.543 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:38.543 "assigned_rate_limits": { 00:15:38.543 "rw_ios_per_sec": 0, 00:15:38.543 "rw_mbytes_per_sec": 0, 00:15:38.543 "r_mbytes_per_sec": 0, 00:15:38.543 "w_mbytes_per_sec": 0 00:15:38.543 }, 00:15:38.543 "claimed": true, 00:15:38.543 "claim_type": "exclusive_write", 00:15:38.543 "zoned": false, 00:15:38.543 "supported_io_types": { 00:15:38.543 "read": true, 00:15:38.543 "write": true, 00:15:38.543 "unmap": true, 00:15:38.543 "flush": true, 00:15:38.543 "reset": true, 00:15:38.543 "nvme_admin": false, 00:15:38.543 "nvme_io": false, 00:15:38.543 "nvme_io_md": false, 00:15:38.543 "write_zeroes": true, 00:15:38.543 "zcopy": true, 00:15:38.543 "get_zone_info": false, 00:15:38.543 "zone_management": false, 00:15:38.543 "zone_append": false, 00:15:38.543 "compare": false, 00:15:38.543 "compare_and_write": false, 00:15:38.543 "abort": true, 00:15:38.543 "seek_hole": false, 00:15:38.543 "seek_data": false, 00:15:38.543 "copy": true, 00:15:38.543 "nvme_iov_md": 
false 00:15:38.543 }, 00:15:38.543 "memory_domains": [ 00:15:38.543 { 00:15:38.543 "dma_device_id": "system", 00:15:38.543 "dma_device_type": 1 00:15:38.543 }, 00:15:38.543 { 00:15:38.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.543 "dma_device_type": 2 00:15:38.543 } 00:15:38.543 ], 00:15:38.543 "driver_specific": {} 00:15:38.543 } 00:15:38.543 ] 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.543 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.802 19:51:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.802 "name": "Existed_Raid", 00:15:38.802 "uuid": "2835fb37-36c9-4f0b-afd8-a2c77f3fef19", 00:15:38.802 "strip_size_kb": 64, 00:15:38.802 "state": "online", 00:15:38.802 "raid_level": "concat", 00:15:38.802 "superblock": false, 00:15:38.802 "num_base_bdevs": 3, 00:15:38.802 "num_base_bdevs_discovered": 3, 00:15:38.802 "num_base_bdevs_operational": 3, 00:15:38.802 "base_bdevs_list": [ 00:15:38.802 { 00:15:38.802 "name": "NewBaseBdev", 00:15:38.802 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:38.802 "is_configured": true, 00:15:38.802 "data_offset": 0, 00:15:38.802 "data_size": 65536 00:15:38.802 }, 00:15:38.802 { 00:15:38.802 "name": "BaseBdev2", 00:15:38.802 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:38.802 "is_configured": true, 00:15:38.802 "data_offset": 0, 00:15:38.802 "data_size": 65536 00:15:38.802 }, 00:15:38.802 { 00:15:38.802 "name": "BaseBdev3", 00:15:38.802 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:38.802 "is_configured": true, 00:15:38.802 "data_offset": 0, 00:15:38.802 "data_size": 65536 00:15:38.802 } 00:15:38.802 ] 00:15:38.802 }' 00:15:38.802 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.802 19:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.740 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:39.740 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:39.740 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:39.740 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:39.740 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:39.740 19:51:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:39.740 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:39.740 19:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:39.740 [2024-07-24 19:51:31.128582] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:39.740 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:39.740 "name": "Existed_Raid", 00:15:39.740 "aliases": [ 00:15:39.740 "2835fb37-36c9-4f0b-afd8-a2c77f3fef19" 00:15:39.740 ], 00:15:39.740 "product_name": "Raid Volume", 00:15:39.740 "block_size": 512, 00:15:39.740 "num_blocks": 196608, 00:15:39.740 "uuid": "2835fb37-36c9-4f0b-afd8-a2c77f3fef19", 00:15:39.740 "assigned_rate_limits": { 00:15:39.740 "rw_ios_per_sec": 0, 00:15:39.740 "rw_mbytes_per_sec": 0, 00:15:39.740 "r_mbytes_per_sec": 0, 00:15:39.740 "w_mbytes_per_sec": 0 00:15:39.740 }, 00:15:39.740 "claimed": false, 00:15:39.740 "zoned": false, 00:15:39.740 "supported_io_types": { 00:15:39.740 "read": true, 00:15:39.740 "write": true, 00:15:39.740 "unmap": true, 00:15:39.740 "flush": true, 00:15:39.740 "reset": true, 00:15:39.740 "nvme_admin": false, 00:15:39.740 "nvme_io": false, 00:15:39.740 "nvme_io_md": false, 00:15:39.740 "write_zeroes": true, 00:15:39.740 "zcopy": false, 00:15:39.740 "get_zone_info": false, 00:15:39.740 "zone_management": false, 00:15:39.740 "zone_append": false, 00:15:39.740 "compare": false, 00:15:39.740 "compare_and_write": false, 00:15:39.740 "abort": false, 00:15:39.740 "seek_hole": false, 00:15:39.740 "seek_data": false, 00:15:39.740 "copy": false, 00:15:39.740 "nvme_iov_md": false 00:15:39.740 }, 00:15:39.740 "memory_domains": [ 00:15:39.740 { 00:15:39.740 "dma_device_id": "system", 00:15:39.740 "dma_device_type": 1 00:15:39.740 }, 
00:15:39.740 { 00:15:39.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.740 "dma_device_type": 2 00:15:39.740 }, 00:15:39.740 { 00:15:39.740 "dma_device_id": "system", 00:15:39.740 "dma_device_type": 1 00:15:39.740 }, 00:15:39.740 { 00:15:39.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.740 "dma_device_type": 2 00:15:39.740 }, 00:15:39.740 { 00:15:39.740 "dma_device_id": "system", 00:15:39.740 "dma_device_type": 1 00:15:39.740 }, 00:15:39.740 { 00:15:39.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.740 "dma_device_type": 2 00:15:39.740 } 00:15:39.740 ], 00:15:39.740 "driver_specific": { 00:15:39.740 "raid": { 00:15:39.740 "uuid": "2835fb37-36c9-4f0b-afd8-a2c77f3fef19", 00:15:39.740 "strip_size_kb": 64, 00:15:39.740 "state": "online", 00:15:39.740 "raid_level": "concat", 00:15:39.740 "superblock": false, 00:15:39.740 "num_base_bdevs": 3, 00:15:39.740 "num_base_bdevs_discovered": 3, 00:15:39.740 "num_base_bdevs_operational": 3, 00:15:39.740 "base_bdevs_list": [ 00:15:39.740 { 00:15:39.740 "name": "NewBaseBdev", 00:15:39.740 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:39.740 "is_configured": true, 00:15:39.740 "data_offset": 0, 00:15:39.740 "data_size": 65536 00:15:39.740 }, 00:15:39.740 { 00:15:39.740 "name": "BaseBdev2", 00:15:39.740 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:39.740 "is_configured": true, 00:15:39.740 "data_offset": 0, 00:15:39.740 "data_size": 65536 00:15:39.740 }, 00:15:39.740 { 00:15:39.740 "name": "BaseBdev3", 00:15:39.740 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:39.740 "is_configured": true, 00:15:39.740 "data_offset": 0, 00:15:39.740 "data_size": 65536 00:15:39.740 } 00:15:39.740 ] 00:15:39.740 } 00:15:39.740 } 00:15:39.740 }' 00:15:39.740 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:39.740 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:39.740 BaseBdev2 00:15:39.740 BaseBdev3' 00:15:39.740 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:39.740 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:39.740 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:39.999 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:39.999 "name": "NewBaseBdev", 00:15:39.999 "aliases": [ 00:15:39.999 "39790942-ca66-4488-895f-e99cc1b413ef" 00:15:40.000 ], 00:15:40.000 "product_name": "Malloc disk", 00:15:40.000 "block_size": 512, 00:15:40.000 "num_blocks": 65536, 00:15:40.000 "uuid": "39790942-ca66-4488-895f-e99cc1b413ef", 00:15:40.000 "assigned_rate_limits": { 00:15:40.000 "rw_ios_per_sec": 0, 00:15:40.000 "rw_mbytes_per_sec": 0, 00:15:40.000 "r_mbytes_per_sec": 0, 00:15:40.000 "w_mbytes_per_sec": 0 00:15:40.000 }, 00:15:40.000 "claimed": true, 00:15:40.000 "claim_type": "exclusive_write", 00:15:40.000 "zoned": false, 00:15:40.000 "supported_io_types": { 00:15:40.000 "read": true, 00:15:40.000 "write": true, 00:15:40.000 "unmap": true, 00:15:40.000 "flush": true, 00:15:40.000 "reset": true, 00:15:40.000 "nvme_admin": false, 00:15:40.000 "nvme_io": false, 00:15:40.000 "nvme_io_md": false, 00:15:40.000 "write_zeroes": true, 00:15:40.000 "zcopy": true, 00:15:40.000 "get_zone_info": false, 00:15:40.000 "zone_management": false, 00:15:40.000 "zone_append": false, 00:15:40.000 "compare": false, 00:15:40.000 "compare_and_write": false, 00:15:40.000 "abort": true, 00:15:40.000 "seek_hole": false, 00:15:40.000 "seek_data": false, 00:15:40.000 "copy": true, 00:15:40.000 "nvme_iov_md": false 00:15:40.000 }, 00:15:40.000 "memory_domains": [ 00:15:40.000 { 00:15:40.000 "dma_device_id": "system", 00:15:40.000 
"dma_device_type": 1 00:15:40.000 }, 00:15:40.000 { 00:15:40.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.000 "dma_device_type": 2 00:15:40.000 } 00:15:40.000 ], 00:15:40.000 "driver_specific": {} 00:15:40.000 }' 00:15:40.000 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.000 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.000 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.000 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.000 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:40.259 19:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:40.519 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.519 "name": 
"BaseBdev2", 00:15:40.519 "aliases": [ 00:15:40.519 "0a470a92-ec83-4475-8d1d-831b2fde0170" 00:15:40.519 ], 00:15:40.519 "product_name": "Malloc disk", 00:15:40.519 "block_size": 512, 00:15:40.519 "num_blocks": 65536, 00:15:40.519 "uuid": "0a470a92-ec83-4475-8d1d-831b2fde0170", 00:15:40.519 "assigned_rate_limits": { 00:15:40.519 "rw_ios_per_sec": 0, 00:15:40.519 "rw_mbytes_per_sec": 0, 00:15:40.519 "r_mbytes_per_sec": 0, 00:15:40.519 "w_mbytes_per_sec": 0 00:15:40.519 }, 00:15:40.519 "claimed": true, 00:15:40.519 "claim_type": "exclusive_write", 00:15:40.519 "zoned": false, 00:15:40.519 "supported_io_types": { 00:15:40.519 "read": true, 00:15:40.519 "write": true, 00:15:40.519 "unmap": true, 00:15:40.519 "flush": true, 00:15:40.519 "reset": true, 00:15:40.519 "nvme_admin": false, 00:15:40.519 "nvme_io": false, 00:15:40.519 "nvme_io_md": false, 00:15:40.519 "write_zeroes": true, 00:15:40.519 "zcopy": true, 00:15:40.519 "get_zone_info": false, 00:15:40.519 "zone_management": false, 00:15:40.519 "zone_append": false, 00:15:40.519 "compare": false, 00:15:40.519 "compare_and_write": false, 00:15:40.519 "abort": true, 00:15:40.519 "seek_hole": false, 00:15:40.519 "seek_data": false, 00:15:40.519 "copy": true, 00:15:40.519 "nvme_iov_md": false 00:15:40.519 }, 00:15:40.519 "memory_domains": [ 00:15:40.519 { 00:15:40.519 "dma_device_id": "system", 00:15:40.519 "dma_device_type": 1 00:15:40.519 }, 00:15:40.519 { 00:15:40.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.519 "dma_device_type": 2 00:15:40.519 } 00:15:40.519 ], 00:15:40.519 "driver_specific": {} 00:15:40.519 }' 00:15:40.519 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.519 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.519 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:40.519 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:40.778 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.038 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:41.038 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:41.297 "name": "BaseBdev3", 00:15:41.297 "aliases": [ 00:15:41.297 "d7d3b8be-74a3-42ea-bda2-de16e9aa1146" 00:15:41.297 ], 00:15:41.297 "product_name": "Malloc disk", 00:15:41.297 "block_size": 512, 00:15:41.297 "num_blocks": 65536, 00:15:41.297 "uuid": "d7d3b8be-74a3-42ea-bda2-de16e9aa1146", 00:15:41.297 "assigned_rate_limits": { 00:15:41.297 "rw_ios_per_sec": 0, 00:15:41.297 "rw_mbytes_per_sec": 0, 00:15:41.297 "r_mbytes_per_sec": 0, 00:15:41.297 "w_mbytes_per_sec": 0 00:15:41.297 }, 00:15:41.297 "claimed": true, 00:15:41.297 "claim_type": "exclusive_write", 00:15:41.297 "zoned": false, 00:15:41.297 "supported_io_types": { 
00:15:41.297 "read": true, 00:15:41.297 "write": true, 00:15:41.297 "unmap": true, 00:15:41.297 "flush": true, 00:15:41.297 "reset": true, 00:15:41.297 "nvme_admin": false, 00:15:41.297 "nvme_io": false, 00:15:41.297 "nvme_io_md": false, 00:15:41.297 "write_zeroes": true, 00:15:41.297 "zcopy": true, 00:15:41.297 "get_zone_info": false, 00:15:41.297 "zone_management": false, 00:15:41.297 "zone_append": false, 00:15:41.297 "compare": false, 00:15:41.297 "compare_and_write": false, 00:15:41.297 "abort": true, 00:15:41.297 "seek_hole": false, 00:15:41.297 "seek_data": false, 00:15:41.297 "copy": true, 00:15:41.297 "nvme_iov_md": false 00:15:41.297 }, 00:15:41.297 "memory_domains": [ 00:15:41.297 { 00:15:41.297 "dma_device_id": "system", 00:15:41.297 "dma_device_type": 1 00:15:41.297 }, 00:15:41.297 { 00:15:41.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.297 "dma_device_type": 2 00:15:41.297 } 00:15:41.297 ], 00:15:41.297 "driver_specific": {} 00:15:41.297 }' 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.297 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.557 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.557 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:41.557 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.557 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.557 19:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:41.557 [2024-07-24 19:51:33.141633] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:41.557 [2024-07-24 19:51:33.141666] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:41.557 [2024-07-24 19:51:33.141727] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:41.557 [2024-07-24 19:51:33.141784] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:41.557 [2024-07-24 19:51:33.141803] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a234f0 name Existed_Raid, state offline 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1407437 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1407437 ']' 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1407437 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1407437 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' 
reactor_0 = sudo ']' 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1407437' 00:15:41.817 killing process with pid 1407437 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1407437 00:15:41.817 [2024-07-24 19:51:33.205074] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:41.817 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1407437 00:15:41.817 [2024-07-24 19:51:33.270093] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:42.385 19:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:42.385 00:15:42.385 real 0m30.326s 00:15:42.385 user 0m55.346s 00:15:42.385 sys 0m5.428s 00:15:42.385 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:42.385 19:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.385 ************************************ 00:15:42.385 END TEST raid_state_function_test 00:15:42.385 ************************************ 00:15:42.385 19:51:33 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:42.385 19:51:33 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:42.385 19:51:33 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:42.385 19:51:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:42.385 ************************************ 00:15:42.385 START TEST raid_state_function_test_sb 00:15:42.385 ************************************ 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 true 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:42.386 19:51:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:42.386 19:51:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1412308 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1412308' 00:15:42.386 Process raid pid: 1412308 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1412308 /var/tmp/spdk-raid.sock 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1412308 ']' 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:42.386 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:42.386 19:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:42.386 [2024-07-24 19:51:33.831755] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:15:42.386 [2024-07-24 19:51:33.831809] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:42.386 [2024-07-24 19:51:33.946985] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:42.645 [2024-07-24 19:51:34.052201] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:42.645 [2024-07-24 19:51:34.113030] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:42.645 [2024-07-24 19:51:34.113059] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:43.213 19:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:43.213 19:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:15:43.214 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:43.473 [2024-07-24 19:51:34.930527] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:43.473 [2024-07-24 19:51:34.930566] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:43.473 [2024-07-24 19:51:34.930577] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev2 00:15:43.473 [2024-07-24 19:51:34.930588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:43.473 [2024-07-24 19:51:34.930601] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:43.473 [2024-07-24 19:51:34.930613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.473 19:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:43.731 19:51:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.731 "name": "Existed_Raid", 00:15:43.731 "uuid": "40499e13-3ca4-4f58-924a-2206cba06307", 00:15:43.731 "strip_size_kb": 64, 00:15:43.731 "state": "configuring", 00:15:43.731 "raid_level": "concat", 00:15:43.731 "superblock": true, 00:15:43.731 "num_base_bdevs": 3, 00:15:43.731 "num_base_bdevs_discovered": 0, 00:15:43.731 "num_base_bdevs_operational": 3, 00:15:43.731 "base_bdevs_list": [ 00:15:43.731 { 00:15:43.731 "name": "BaseBdev1", 00:15:43.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.731 "is_configured": false, 00:15:43.731 "data_offset": 0, 00:15:43.731 "data_size": 0 00:15:43.731 }, 00:15:43.731 { 00:15:43.731 "name": "BaseBdev2", 00:15:43.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.731 "is_configured": false, 00:15:43.731 "data_offset": 0, 00:15:43.732 "data_size": 0 00:15:43.732 }, 00:15:43.732 { 00:15:43.732 "name": "BaseBdev3", 00:15:43.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.732 "is_configured": false, 00:15:43.732 "data_offset": 0, 00:15:43.732 "data_size": 0 00:15:43.732 } 00:15:43.732 ] 00:15:43.732 }' 00:15:43.732 19:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.732 19:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:44.299 19:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:44.559 [2024-07-24 19:51:36.041334] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:44.559 [2024-07-24 19:51:36.041365] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21c7a10 name Existed_Raid, state configuring 00:15:44.559 19:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:44.818 [2024-07-24 19:51:36.290023] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:44.818 [2024-07-24 19:51:36.290059] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:44.818 [2024-07-24 19:51:36.290069] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:44.818 [2024-07-24 19:51:36.290080] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:44.818 [2024-07-24 19:51:36.290089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:44.818 [2024-07-24 19:51:36.290100] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:44.818 19:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:45.078 [2024-07-24 19:51:36.545762] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:45.078 BaseBdev1 00:15:45.078 19:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:45.078 19:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:45.078 19:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:45.078 19:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:45.078 19:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:45.078 19:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:15:45.078 19:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:45.338 19:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:45.597 [ 00:15:45.597 { 00:15:45.597 "name": "BaseBdev1", 00:15:45.597 "aliases": [ 00:15:45.597 "9fdde662-fcaf-4972-bb4a-c0e6c924ec92" 00:15:45.597 ], 00:15:45.597 "product_name": "Malloc disk", 00:15:45.597 "block_size": 512, 00:15:45.597 "num_blocks": 65536, 00:15:45.597 "uuid": "9fdde662-fcaf-4972-bb4a-c0e6c924ec92", 00:15:45.597 "assigned_rate_limits": { 00:15:45.597 "rw_ios_per_sec": 0, 00:15:45.597 "rw_mbytes_per_sec": 0, 00:15:45.597 "r_mbytes_per_sec": 0, 00:15:45.597 "w_mbytes_per_sec": 0 00:15:45.597 }, 00:15:45.597 "claimed": true, 00:15:45.597 "claim_type": "exclusive_write", 00:15:45.597 "zoned": false, 00:15:45.597 "supported_io_types": { 00:15:45.597 "read": true, 00:15:45.597 "write": true, 00:15:45.597 "unmap": true, 00:15:45.597 "flush": true, 00:15:45.597 "reset": true, 00:15:45.597 "nvme_admin": false, 00:15:45.597 "nvme_io": false, 00:15:45.597 "nvme_io_md": false, 00:15:45.597 "write_zeroes": true, 00:15:45.597 "zcopy": true, 00:15:45.597 "get_zone_info": false, 00:15:45.597 "zone_management": false, 00:15:45.597 "zone_append": false, 00:15:45.597 "compare": false, 00:15:45.597 "compare_and_write": false, 00:15:45.597 "abort": true, 00:15:45.597 "seek_hole": false, 00:15:45.597 "seek_data": false, 00:15:45.597 "copy": true, 00:15:45.597 "nvme_iov_md": false 00:15:45.597 }, 00:15:45.597 "memory_domains": [ 00:15:45.597 { 00:15:45.597 "dma_device_id": "system", 00:15:45.597 "dma_device_type": 1 00:15:45.597 }, 00:15:45.597 { 00:15:45.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.597 
"dma_device_type": 2 00:15:45.597 } 00:15:45.597 ], 00:15:45.597 "driver_specific": {} 00:15:45.597 } 00:15:45.597 ] 00:15:45.597 19:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:45.597 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.598 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.857 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.857 "name": "Existed_Raid", 00:15:45.857 "uuid": "a0163e61-26ce-4e59-a6ed-416b4ee2c595", 00:15:45.857 "strip_size_kb": 64, 
00:15:45.857 "state": "configuring", 00:15:45.857 "raid_level": "concat", 00:15:45.857 "superblock": true, 00:15:45.857 "num_base_bdevs": 3, 00:15:45.857 "num_base_bdevs_discovered": 1, 00:15:45.857 "num_base_bdevs_operational": 3, 00:15:45.857 "base_bdevs_list": [ 00:15:45.857 { 00:15:45.857 "name": "BaseBdev1", 00:15:45.857 "uuid": "9fdde662-fcaf-4972-bb4a-c0e6c924ec92", 00:15:45.857 "is_configured": true, 00:15:45.857 "data_offset": 2048, 00:15:45.857 "data_size": 63488 00:15:45.857 }, 00:15:45.857 { 00:15:45.857 "name": "BaseBdev2", 00:15:45.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.857 "is_configured": false, 00:15:45.857 "data_offset": 0, 00:15:45.857 "data_size": 0 00:15:45.857 }, 00:15:45.857 { 00:15:45.857 "name": "BaseBdev3", 00:15:45.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.857 "is_configured": false, 00:15:45.857 "data_offset": 0, 00:15:45.857 "data_size": 0 00:15:45.857 } 00:15:45.857 ] 00:15:45.857 }' 00:15:45.857 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.857 19:51:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.425 19:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:46.684 [2024-07-24 19:51:38.085842] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:46.684 [2024-07-24 19:51:38.085884] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21c72e0 name Existed_Raid, state configuring 00:15:46.684 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:46.944 [2024-07-24 19:51:38.330527] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.944 [2024-07-24 19:51:38.332059] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:46.944 [2024-07-24 19:51:38.332092] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:46.944 [2024-07-24 19:51:38.332103] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:46.944 [2024-07-24 19:51:38.332114] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.944 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.203 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.203 "name": "Existed_Raid", 00:15:47.203 "uuid": "5a906f32-acd9-43d0-a28b-987d7dde325b", 00:15:47.203 "strip_size_kb": 64, 00:15:47.203 "state": "configuring", 00:15:47.203 "raid_level": "concat", 00:15:47.203 "superblock": true, 00:15:47.203 "num_base_bdevs": 3, 00:15:47.203 "num_base_bdevs_discovered": 1, 00:15:47.203 "num_base_bdevs_operational": 3, 00:15:47.203 "base_bdevs_list": [ 00:15:47.203 { 00:15:47.203 "name": "BaseBdev1", 00:15:47.203 "uuid": "9fdde662-fcaf-4972-bb4a-c0e6c924ec92", 00:15:47.203 "is_configured": true, 00:15:47.203 "data_offset": 2048, 00:15:47.203 "data_size": 63488 00:15:47.203 }, 00:15:47.203 { 00:15:47.203 "name": "BaseBdev2", 00:15:47.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.203 "is_configured": false, 00:15:47.203 "data_offset": 0, 00:15:47.203 "data_size": 0 00:15:47.203 }, 00:15:47.203 { 00:15:47.203 "name": "BaseBdev3", 00:15:47.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.203 "is_configured": false, 00:15:47.203 "data_offset": 0, 00:15:47.203 "data_size": 0 00:15:47.203 } 00:15:47.203 ] 00:15:47.203 }' 00:15:47.203 19:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.203 19:51:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:15:48.141 [2024-07-24 19:51:39.674706] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:48.141 BaseBdev2 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:48.141 19:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:48.400 19:51:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:48.659 [ 00:15:48.659 { 00:15:48.659 "name": "BaseBdev2", 00:15:48.659 "aliases": [ 00:15:48.659 "020a64b8-68f0-4be6-b6d5-b92ad53f1425" 00:15:48.659 ], 00:15:48.659 "product_name": "Malloc disk", 00:15:48.659 "block_size": 512, 00:15:48.659 "num_blocks": 65536, 00:15:48.659 "uuid": "020a64b8-68f0-4be6-b6d5-b92ad53f1425", 00:15:48.659 "assigned_rate_limits": { 00:15:48.659 "rw_ios_per_sec": 0, 00:15:48.659 "rw_mbytes_per_sec": 0, 00:15:48.659 "r_mbytes_per_sec": 0, 00:15:48.659 "w_mbytes_per_sec": 0 00:15:48.659 }, 00:15:48.659 "claimed": true, 00:15:48.659 "claim_type": "exclusive_write", 00:15:48.659 "zoned": false, 00:15:48.659 "supported_io_types": { 00:15:48.659 "read": true, 00:15:48.659 "write": true, 
00:15:48.659 "unmap": true, 00:15:48.659 "flush": true, 00:15:48.659 "reset": true, 00:15:48.659 "nvme_admin": false, 00:15:48.659 "nvme_io": false, 00:15:48.659 "nvme_io_md": false, 00:15:48.659 "write_zeroes": true, 00:15:48.659 "zcopy": true, 00:15:48.659 "get_zone_info": false, 00:15:48.659 "zone_management": false, 00:15:48.659 "zone_append": false, 00:15:48.659 "compare": false, 00:15:48.659 "compare_and_write": false, 00:15:48.659 "abort": true, 00:15:48.659 "seek_hole": false, 00:15:48.659 "seek_data": false, 00:15:48.659 "copy": true, 00:15:48.659 "nvme_iov_md": false 00:15:48.659 }, 00:15:48.659 "memory_domains": [ 00:15:48.659 { 00:15:48.659 "dma_device_id": "system", 00:15:48.659 "dma_device_type": 1 00:15:48.659 }, 00:15:48.659 { 00:15:48.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.659 "dma_device_type": 2 00:15:48.659 } 00:15:48.659 ], 00:15:48.659 "driver_specific": {} 00:15:48.659 } 00:15:48.659 ] 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.659 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.918 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.918 "name": "Existed_Raid", 00:15:48.918 "uuid": "5a906f32-acd9-43d0-a28b-987d7dde325b", 00:15:48.918 "strip_size_kb": 64, 00:15:48.918 "state": "configuring", 00:15:48.918 "raid_level": "concat", 00:15:48.918 "superblock": true, 00:15:48.918 "num_base_bdevs": 3, 00:15:48.918 "num_base_bdevs_discovered": 2, 00:15:48.918 "num_base_bdevs_operational": 3, 00:15:48.918 "base_bdevs_list": [ 00:15:48.918 { 00:15:48.918 "name": "BaseBdev1", 00:15:48.918 "uuid": "9fdde662-fcaf-4972-bb4a-c0e6c924ec92", 00:15:48.918 "is_configured": true, 00:15:48.918 "data_offset": 2048, 00:15:48.918 "data_size": 63488 00:15:48.918 }, 00:15:48.918 { 00:15:48.918 "name": "BaseBdev2", 00:15:48.918 "uuid": "020a64b8-68f0-4be6-b6d5-b92ad53f1425", 00:15:48.918 "is_configured": true, 00:15:48.918 "data_offset": 2048, 00:15:48.918 "data_size": 63488 00:15:48.918 }, 00:15:48.918 { 00:15:48.919 "name": "BaseBdev3", 00:15:48.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.919 "is_configured": false, 00:15:48.919 "data_offset": 0, 00:15:48.919 "data_size": 0 00:15:48.919 } 
00:15:48.919 ] 00:15:48.919 }' 00:15:48.919 19:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.919 19:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:49.855 [2024-07-24 19:51:41.323220] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:49.855 [2024-07-24 19:51:41.323405] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21c81d0 00:15:49.855 [2024-07-24 19:51:41.323421] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:49.855 [2024-07-24 19:51:41.323599] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x236bfb0 00:15:49.855 [2024-07-24 19:51:41.323729] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21c81d0 00:15:49.855 [2024-07-24 19:51:41.323739] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21c81d0 00:15:49.855 [2024-07-24 19:51:41.323832] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:49.855 BaseBdev3 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:49.855 19:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:50.114 19:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:50.373 [ 00:15:50.373 { 00:15:50.373 "name": "BaseBdev3", 00:15:50.373 "aliases": [ 00:15:50.373 "9ec07ae0-9116-4de5-956a-3833d0f4607f" 00:15:50.373 ], 00:15:50.373 "product_name": "Malloc disk", 00:15:50.373 "block_size": 512, 00:15:50.373 "num_blocks": 65536, 00:15:50.373 "uuid": "9ec07ae0-9116-4de5-956a-3833d0f4607f", 00:15:50.373 "assigned_rate_limits": { 00:15:50.373 "rw_ios_per_sec": 0, 00:15:50.373 "rw_mbytes_per_sec": 0, 00:15:50.373 "r_mbytes_per_sec": 0, 00:15:50.373 "w_mbytes_per_sec": 0 00:15:50.373 }, 00:15:50.373 "claimed": true, 00:15:50.373 "claim_type": "exclusive_write", 00:15:50.373 "zoned": false, 00:15:50.373 "supported_io_types": { 00:15:50.373 "read": true, 00:15:50.373 "write": true, 00:15:50.373 "unmap": true, 00:15:50.373 "flush": true, 00:15:50.373 "reset": true, 00:15:50.373 "nvme_admin": false, 00:15:50.373 "nvme_io": false, 00:15:50.373 "nvme_io_md": false, 00:15:50.373 "write_zeroes": true, 00:15:50.373 "zcopy": true, 00:15:50.373 "get_zone_info": false, 00:15:50.373 "zone_management": false, 00:15:50.373 "zone_append": false, 00:15:50.374 "compare": false, 00:15:50.374 "compare_and_write": false, 00:15:50.374 "abort": true, 00:15:50.374 "seek_hole": false, 00:15:50.374 "seek_data": false, 00:15:50.374 "copy": true, 00:15:50.374 "nvme_iov_md": false 00:15:50.374 }, 00:15:50.374 "memory_domains": [ 00:15:50.374 { 00:15:50.374 "dma_device_id": "system", 00:15:50.374 "dma_device_type": 1 00:15:50.374 }, 00:15:50.374 { 00:15:50.374 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:50.374 "dma_device_type": 2 00:15:50.374 } 00:15:50.374 ], 00:15:50.374 "driver_specific": {} 00:15:50.374 } 00:15:50.374 ] 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.374 19:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:50.633 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.633 "name": "Existed_Raid", 00:15:50.633 "uuid": "5a906f32-acd9-43d0-a28b-987d7dde325b", 00:15:50.633 "strip_size_kb": 64, 00:15:50.633 "state": "online", 00:15:50.633 "raid_level": "concat", 00:15:50.633 "superblock": true, 00:15:50.633 "num_base_bdevs": 3, 00:15:50.633 "num_base_bdevs_discovered": 3, 00:15:50.633 "num_base_bdevs_operational": 3, 00:15:50.633 "base_bdevs_list": [ 00:15:50.633 { 00:15:50.633 "name": "BaseBdev1", 00:15:50.633 "uuid": "9fdde662-fcaf-4972-bb4a-c0e6c924ec92", 00:15:50.633 "is_configured": true, 00:15:50.633 "data_offset": 2048, 00:15:50.633 "data_size": 63488 00:15:50.633 }, 00:15:50.633 { 00:15:50.633 "name": "BaseBdev2", 00:15:50.633 "uuid": "020a64b8-68f0-4be6-b6d5-b92ad53f1425", 00:15:50.633 "is_configured": true, 00:15:50.633 "data_offset": 2048, 00:15:50.633 "data_size": 63488 00:15:50.633 }, 00:15:50.633 { 00:15:50.633 "name": "BaseBdev3", 00:15:50.633 "uuid": "9ec07ae0-9116-4de5-956a-3833d0f4607f", 00:15:50.633 "is_configured": true, 00:15:50.633 "data_offset": 2048, 00:15:50.633 "data_size": 63488 00:15:50.633 } 00:15:50.633 ] 00:15:50.633 }' 00:15:50.633 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.633 19:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:51.569 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:51.569 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:51.569 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:51.569 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:51.569 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:51.570 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:51.570 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:51.570 19:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:51.570 [2024-07-24 19:51:43.096258] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:51.570 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:51.570 "name": "Existed_Raid", 00:15:51.570 "aliases": [ 00:15:51.570 "5a906f32-acd9-43d0-a28b-987d7dde325b" 00:15:51.570 ], 00:15:51.570 "product_name": "Raid Volume", 00:15:51.570 "block_size": 512, 00:15:51.570 "num_blocks": 190464, 00:15:51.570 "uuid": "5a906f32-acd9-43d0-a28b-987d7dde325b", 00:15:51.570 "assigned_rate_limits": { 00:15:51.570 "rw_ios_per_sec": 0, 00:15:51.570 "rw_mbytes_per_sec": 0, 00:15:51.570 "r_mbytes_per_sec": 0, 00:15:51.570 "w_mbytes_per_sec": 0 00:15:51.570 }, 00:15:51.570 "claimed": false, 00:15:51.570 "zoned": false, 00:15:51.570 "supported_io_types": { 00:15:51.570 "read": true, 00:15:51.570 "write": true, 00:15:51.570 "unmap": true, 00:15:51.570 "flush": true, 00:15:51.570 "reset": true, 00:15:51.570 "nvme_admin": false, 00:15:51.570 "nvme_io": false, 00:15:51.570 "nvme_io_md": false, 00:15:51.570 "write_zeroes": true, 00:15:51.570 "zcopy": false, 00:15:51.570 "get_zone_info": false, 00:15:51.570 "zone_management": false, 00:15:51.570 "zone_append": false, 00:15:51.570 "compare": false, 00:15:51.570 "compare_and_write": false, 00:15:51.570 "abort": false, 00:15:51.570 "seek_hole": false, 00:15:51.570 "seek_data": false, 00:15:51.570 "copy": false, 00:15:51.570 "nvme_iov_md": false 00:15:51.570 }, 00:15:51.570 "memory_domains": [ 00:15:51.570 { 00:15:51.570 "dma_device_id": "system", 
00:15:51.570 "dma_device_type": 1 00:15:51.570 }, 00:15:51.570 { 00:15:51.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.570 "dma_device_type": 2 00:15:51.570 }, 00:15:51.570 { 00:15:51.570 "dma_device_id": "system", 00:15:51.570 "dma_device_type": 1 00:15:51.570 }, 00:15:51.570 { 00:15:51.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.570 "dma_device_type": 2 00:15:51.570 }, 00:15:51.570 { 00:15:51.570 "dma_device_id": "system", 00:15:51.570 "dma_device_type": 1 00:15:51.570 }, 00:15:51.570 { 00:15:51.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.570 "dma_device_type": 2 00:15:51.570 } 00:15:51.570 ], 00:15:51.570 "driver_specific": { 00:15:51.570 "raid": { 00:15:51.570 "uuid": "5a906f32-acd9-43d0-a28b-987d7dde325b", 00:15:51.570 "strip_size_kb": 64, 00:15:51.570 "state": "online", 00:15:51.570 "raid_level": "concat", 00:15:51.570 "superblock": true, 00:15:51.570 "num_base_bdevs": 3, 00:15:51.570 "num_base_bdevs_discovered": 3, 00:15:51.570 "num_base_bdevs_operational": 3, 00:15:51.570 "base_bdevs_list": [ 00:15:51.570 { 00:15:51.570 "name": "BaseBdev1", 00:15:51.570 "uuid": "9fdde662-fcaf-4972-bb4a-c0e6c924ec92", 00:15:51.570 "is_configured": true, 00:15:51.570 "data_offset": 2048, 00:15:51.570 "data_size": 63488 00:15:51.570 }, 00:15:51.570 { 00:15:51.570 "name": "BaseBdev2", 00:15:51.570 "uuid": "020a64b8-68f0-4be6-b6d5-b92ad53f1425", 00:15:51.570 "is_configured": true, 00:15:51.570 "data_offset": 2048, 00:15:51.570 "data_size": 63488 00:15:51.570 }, 00:15:51.570 { 00:15:51.570 "name": "BaseBdev3", 00:15:51.570 "uuid": "9ec07ae0-9116-4de5-956a-3833d0f4607f", 00:15:51.570 "is_configured": true, 00:15:51.570 "data_offset": 2048, 00:15:51.570 "data_size": 63488 00:15:51.570 } 00:15:51.570 ] 00:15:51.570 } 00:15:51.570 } 00:15:51.570 }' 00:15:51.570 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:51.829 19:51:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:51.829 BaseBdev2 00:15:51.829 BaseBdev3' 00:15:51.829 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.829 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:51.829 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:52.087 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:52.087 "name": "BaseBdev1", 00:15:52.087 "aliases": [ 00:15:52.087 "9fdde662-fcaf-4972-bb4a-c0e6c924ec92" 00:15:52.087 ], 00:15:52.087 "product_name": "Malloc disk", 00:15:52.087 "block_size": 512, 00:15:52.087 "num_blocks": 65536, 00:15:52.087 "uuid": "9fdde662-fcaf-4972-bb4a-c0e6c924ec92", 00:15:52.087 "assigned_rate_limits": { 00:15:52.087 "rw_ios_per_sec": 0, 00:15:52.087 "rw_mbytes_per_sec": 0, 00:15:52.087 "r_mbytes_per_sec": 0, 00:15:52.087 "w_mbytes_per_sec": 0 00:15:52.087 }, 00:15:52.087 "claimed": true, 00:15:52.087 "claim_type": "exclusive_write", 00:15:52.087 "zoned": false, 00:15:52.087 "supported_io_types": { 00:15:52.087 "read": true, 00:15:52.087 "write": true, 00:15:52.087 "unmap": true, 00:15:52.087 "flush": true, 00:15:52.088 "reset": true, 00:15:52.088 "nvme_admin": false, 00:15:52.088 "nvme_io": false, 00:15:52.088 "nvme_io_md": false, 00:15:52.088 "write_zeroes": true, 00:15:52.088 "zcopy": true, 00:15:52.088 "get_zone_info": false, 00:15:52.088 "zone_management": false, 00:15:52.088 "zone_append": false, 00:15:52.088 "compare": false, 00:15:52.088 "compare_and_write": false, 00:15:52.088 "abort": true, 00:15:52.088 "seek_hole": false, 00:15:52.088 "seek_data": false, 00:15:52.088 "copy": true, 00:15:52.088 "nvme_iov_md": false 00:15:52.088 }, 00:15:52.088 "memory_domains": 
[ 00:15:52.088 { 00:15:52.088 "dma_device_id": "system", 00:15:52.088 "dma_device_type": 1 00:15:52.088 }, 00:15:52.088 { 00:15:52.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.088 "dma_device_type": 2 00:15:52.088 } 00:15:52.088 ], 00:15:52.088 "driver_specific": {} 00:15:52.088 }' 00:15:52.088 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.088 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.088 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:52.088 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.088 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.088 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:52.088 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:52.346 19:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:52.633 "name": "BaseBdev2", 00:15:52.633 "aliases": [ 00:15:52.633 "020a64b8-68f0-4be6-b6d5-b92ad53f1425" 00:15:52.633 ], 00:15:52.633 "product_name": "Malloc disk", 00:15:52.633 "block_size": 512, 00:15:52.633 "num_blocks": 65536, 00:15:52.633 "uuid": "020a64b8-68f0-4be6-b6d5-b92ad53f1425", 00:15:52.633 "assigned_rate_limits": { 00:15:52.633 "rw_ios_per_sec": 0, 00:15:52.633 "rw_mbytes_per_sec": 0, 00:15:52.633 "r_mbytes_per_sec": 0, 00:15:52.633 "w_mbytes_per_sec": 0 00:15:52.633 }, 00:15:52.633 "claimed": true, 00:15:52.633 "claim_type": "exclusive_write", 00:15:52.633 "zoned": false, 00:15:52.633 "supported_io_types": { 00:15:52.633 "read": true, 00:15:52.633 "write": true, 00:15:52.633 "unmap": true, 00:15:52.633 "flush": true, 00:15:52.633 "reset": true, 00:15:52.633 "nvme_admin": false, 00:15:52.633 "nvme_io": false, 00:15:52.633 "nvme_io_md": false, 00:15:52.633 "write_zeroes": true, 00:15:52.633 "zcopy": true, 00:15:52.633 "get_zone_info": false, 00:15:52.633 "zone_management": false, 00:15:52.633 "zone_append": false, 00:15:52.633 "compare": false, 00:15:52.633 "compare_and_write": false, 00:15:52.633 "abort": true, 00:15:52.633 "seek_hole": false, 00:15:52.633 "seek_data": false, 00:15:52.633 "copy": true, 00:15:52.633 "nvme_iov_md": false 00:15:52.633 }, 00:15:52.633 "memory_domains": [ 00:15:52.633 { 00:15:52.633 "dma_device_id": "system", 00:15:52.633 "dma_device_type": 1 00:15:52.633 }, 00:15:52.633 { 00:15:52.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.633 "dma_device_type": 2 00:15:52.633 } 00:15:52.633 ], 00:15:52.633 "driver_specific": {} 00:15:52.633 }' 00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:52.633 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:52.892 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:53.151 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:53.151 "name": "BaseBdev3", 00:15:53.151 "aliases": [ 00:15:53.151 "9ec07ae0-9116-4de5-956a-3833d0f4607f" 00:15:53.151 ], 00:15:53.151 "product_name": "Malloc disk", 00:15:53.151 "block_size": 512, 00:15:53.151 "num_blocks": 65536, 00:15:53.151 "uuid": "9ec07ae0-9116-4de5-956a-3833d0f4607f", 00:15:53.151 "assigned_rate_limits": { 00:15:53.151 "rw_ios_per_sec": 0, 00:15:53.151 "rw_mbytes_per_sec": 0, 00:15:53.151 "r_mbytes_per_sec": 0, 00:15:53.151 
"w_mbytes_per_sec": 0 00:15:53.151 }, 00:15:53.151 "claimed": true, 00:15:53.151 "claim_type": "exclusive_write", 00:15:53.151 "zoned": false, 00:15:53.151 "supported_io_types": { 00:15:53.151 "read": true, 00:15:53.151 "write": true, 00:15:53.151 "unmap": true, 00:15:53.151 "flush": true, 00:15:53.151 "reset": true, 00:15:53.151 "nvme_admin": false, 00:15:53.151 "nvme_io": false, 00:15:53.151 "nvme_io_md": false, 00:15:53.151 "write_zeroes": true, 00:15:53.151 "zcopy": true, 00:15:53.151 "get_zone_info": false, 00:15:53.151 "zone_management": false, 00:15:53.151 "zone_append": false, 00:15:53.151 "compare": false, 00:15:53.151 "compare_and_write": false, 00:15:53.151 "abort": true, 00:15:53.151 "seek_hole": false, 00:15:53.151 "seek_data": false, 00:15:53.151 "copy": true, 00:15:53.151 "nvme_iov_md": false 00:15:53.151 }, 00:15:53.151 "memory_domains": [ 00:15:53.151 { 00:15:53.151 "dma_device_id": "system", 00:15:53.151 "dma_device_type": 1 00:15:53.151 }, 00:15:53.151 { 00:15:53.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.151 "dma_device_type": 2 00:15:53.151 } 00:15:53.151 ], 00:15:53.151 "driver_specific": {} 00:15:53.151 }' 00:15:53.151 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.151 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.151 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.151 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.410 19:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:53.670 [2024-07-24 19:51:45.209612] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:53.670 [2024-07-24 19:51:45.209642] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:53.670 [2024-07-24 19:51:45.209684] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.670 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:53.929 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:53.929 "name": "Existed_Raid", 00:15:53.929 "uuid": "5a906f32-acd9-43d0-a28b-987d7dde325b", 00:15:53.929 "strip_size_kb": 64, 00:15:53.929 "state": "offline", 00:15:53.929 "raid_level": "concat", 00:15:53.929 "superblock": true, 00:15:53.929 "num_base_bdevs": 3, 00:15:53.929 "num_base_bdevs_discovered": 2, 00:15:53.929 "num_base_bdevs_operational": 2, 00:15:53.929 "base_bdevs_list": [ 00:15:53.929 { 00:15:53.929 "name": null, 00:15:53.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:53.929 "is_configured": false, 00:15:53.929 "data_offset": 2048, 00:15:53.929 "data_size": 63488 00:15:53.929 }, 00:15:53.929 { 00:15:53.929 "name": "BaseBdev2", 00:15:53.929 "uuid": "020a64b8-68f0-4be6-b6d5-b92ad53f1425", 00:15:53.929 "is_configured": true, 00:15:53.929 "data_offset": 2048, 00:15:53.929 "data_size": 
63488 00:15:53.929 }, 00:15:53.929 { 00:15:53.929 "name": "BaseBdev3", 00:15:53.929 "uuid": "9ec07ae0-9116-4de5-956a-3833d0f4607f", 00:15:53.929 "is_configured": true, 00:15:53.929 "data_offset": 2048, 00:15:53.929 "data_size": 63488 00:15:53.929 } 00:15:53.929 ] 00:15:53.929 }' 00:15:53.929 19:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.929 19:51:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.866 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:54.866 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:54.866 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.866 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:54.866 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:54.866 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:54.866 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:55.125 [2024-07-24 19:51:46.574313] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:55.125 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:55.125 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:55.125 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:55.125 19:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:55.692 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:55.692 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:55.692 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:55.951 [2024-07-24 19:51:47.355055] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:55.951 [2024-07-24 19:51:47.355104] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21c81d0 name Existed_Raid, state offline 00:15:55.951 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:55.951 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:55.951 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.952 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:56.210 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:56.210 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:56.210 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:56.210 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:56.210 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:56.210 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:56.468 BaseBdev2 00:15:56.468 19:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:56.468 19:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:56.469 19:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:56.469 19:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:56.469 19:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:56.469 19:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:56.469 19:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:56.727 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:56.986 [ 00:15:56.986 { 00:15:56.986 "name": "BaseBdev2", 00:15:56.986 "aliases": [ 00:15:56.986 "499eb674-44d0-4104-9599-13ce601f0fd4" 00:15:56.986 ], 00:15:56.986 "product_name": "Malloc disk", 00:15:56.986 "block_size": 512, 00:15:56.986 "num_blocks": 65536, 00:15:56.986 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:15:56.986 "assigned_rate_limits": { 00:15:56.986 "rw_ios_per_sec": 0, 00:15:56.986 "rw_mbytes_per_sec": 0, 00:15:56.986 "r_mbytes_per_sec": 0, 00:15:56.986 "w_mbytes_per_sec": 0 00:15:56.986 }, 00:15:56.986 "claimed": false, 00:15:56.986 "zoned": false, 00:15:56.986 "supported_io_types": { 00:15:56.986 "read": true, 00:15:56.986 "write": true, 00:15:56.986 "unmap": true, 00:15:56.986 "flush": 
true, 00:15:56.986 "reset": true, 00:15:56.986 "nvme_admin": false, 00:15:56.986 "nvme_io": false, 00:15:56.986 "nvme_io_md": false, 00:15:56.986 "write_zeroes": true, 00:15:56.986 "zcopy": true, 00:15:56.986 "get_zone_info": false, 00:15:56.986 "zone_management": false, 00:15:56.986 "zone_append": false, 00:15:56.986 "compare": false, 00:15:56.986 "compare_and_write": false, 00:15:56.986 "abort": true, 00:15:56.986 "seek_hole": false, 00:15:56.986 "seek_data": false, 00:15:56.986 "copy": true, 00:15:56.986 "nvme_iov_md": false 00:15:56.986 }, 00:15:56.986 "memory_domains": [ 00:15:56.986 { 00:15:56.986 "dma_device_id": "system", 00:15:56.986 "dma_device_type": 1 00:15:56.986 }, 00:15:56.986 { 00:15:56.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.986 "dma_device_type": 2 00:15:56.986 } 00:15:56.986 ], 00:15:56.986 "driver_specific": {} 00:15:56.986 } 00:15:56.986 ] 00:15:56.986 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:56.986 19:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:56.986 19:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:56.986 19:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:57.244 BaseBdev3 00:15:57.244 19:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:57.244 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:57.244 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:57.244 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:57.244 19:51:48 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:57.244 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:57.244 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:57.503 19:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:57.503 [ 00:15:57.503 { 00:15:57.503 "name": "BaseBdev3", 00:15:57.503 "aliases": [ 00:15:57.503 "20831677-77bd-4b1c-85a9-d274cb879247" 00:15:57.503 ], 00:15:57.503 "product_name": "Malloc disk", 00:15:57.503 "block_size": 512, 00:15:57.503 "num_blocks": 65536, 00:15:57.503 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:15:57.503 "assigned_rate_limits": { 00:15:57.503 "rw_ios_per_sec": 0, 00:15:57.503 "rw_mbytes_per_sec": 0, 00:15:57.503 "r_mbytes_per_sec": 0, 00:15:57.504 "w_mbytes_per_sec": 0 00:15:57.504 }, 00:15:57.504 "claimed": false, 00:15:57.504 "zoned": false, 00:15:57.504 "supported_io_types": { 00:15:57.504 "read": true, 00:15:57.504 "write": true, 00:15:57.504 "unmap": true, 00:15:57.504 "flush": true, 00:15:57.504 "reset": true, 00:15:57.504 "nvme_admin": false, 00:15:57.504 "nvme_io": false, 00:15:57.504 "nvme_io_md": false, 00:15:57.504 "write_zeroes": true, 00:15:57.504 "zcopy": true, 00:15:57.504 "get_zone_info": false, 00:15:57.504 "zone_management": false, 00:15:57.504 "zone_append": false, 00:15:57.504 "compare": false, 00:15:57.504 "compare_and_write": false, 00:15:57.504 "abort": true, 00:15:57.504 "seek_hole": false, 00:15:57.504 "seek_data": false, 00:15:57.504 "copy": true, 00:15:57.504 "nvme_iov_md": false 00:15:57.504 }, 00:15:57.504 "memory_domains": [ 00:15:57.504 { 00:15:57.504 "dma_device_id": "system", 00:15:57.504 "dma_device_type": 1 
00:15:57.504 }, 00:15:57.504 { 00:15:57.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.504 "dma_device_type": 2 00:15:57.504 } 00:15:57.504 ], 00:15:57.504 "driver_specific": {} 00:15:57.504 } 00:15:57.504 ] 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:57.763 [2024-07-24 19:51:49.318856] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:57.763 [2024-07-24 19:51:49.318903] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:57.763 [2024-07-24 19:51:49.318928] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:57.763 [2024-07-24 19:51:49.320327] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.763 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.022 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.022 "name": "Existed_Raid", 00:15:58.022 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:15:58.022 "strip_size_kb": 64, 00:15:58.022 "state": "configuring", 00:15:58.022 "raid_level": "concat", 00:15:58.022 "superblock": true, 00:15:58.022 "num_base_bdevs": 3, 00:15:58.022 "num_base_bdevs_discovered": 2, 00:15:58.022 "num_base_bdevs_operational": 3, 00:15:58.022 "base_bdevs_list": [ 00:15:58.022 { 00:15:58.022 "name": "BaseBdev1", 00:15:58.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.022 "is_configured": false, 00:15:58.022 "data_offset": 0, 00:15:58.022 "data_size": 0 00:15:58.022 }, 00:15:58.022 { 00:15:58.022 "name": "BaseBdev2", 00:15:58.022 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:15:58.022 "is_configured": true, 00:15:58.022 "data_offset": 2048, 00:15:58.022 "data_size": 63488 00:15:58.022 }, 00:15:58.022 { 00:15:58.022 "name": "BaseBdev3", 00:15:58.022 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:15:58.022 "is_configured": true, 00:15:58.022 "data_offset": 2048, 00:15:58.022 
"data_size": 63488 00:15:58.022 } 00:15:58.022 ] 00:15:58.022 }' 00:15:58.022 19:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.022 19:51:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:58.957 [2024-07-24 19:51:50.345557] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
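Each `verify_raid_bdev_state Existed_Raid configuring concat 64 3` pass above (bdev_raid.sh@116-128) boils down to: dump `bdev_raid_get_bdevs all`, select the raid by name, and compare the fields against the expected values. A self-contained sketch of that check, using `python3` in place of the log's `jq -r '.[] | select(.name == "Existed_Raid")'` and a trimmed sample of the JSON actually dumped above (the `check` helper is a hypothetical name, not part of the real script):

```shell
#!/bin/sh
# Trimmed sample of the raid_bdev_info dumped in the trace above.
raid_bdevs='[{"name": "Existed_Raid", "strip_size_kb": 64, "state": "configuring",
  "raid_level": "concat", "num_base_bdevs": 3, "num_base_bdevs_discovered": 2,
  "num_base_bdevs_operational": 3}]'

# check <field> <expected>: select the named raid and compare one field.
check() {
    got=$(printf '%s' "$raid_bdevs" | python3 -c "
import json, sys
info = next(b for b in json.load(sys.stdin) if b['name'] == 'Existed_Raid')
print(info['$1'])")
    [ "$got" = "$2" ] && echo "$1 ok" || echo "$1 mismatch: $got != $2"
}

check state configuring
check raid_level concat
check strip_size_kb 64
check num_base_bdevs_operational 3
```

After `bdev_raid_remove_base_bdev BaseBdev2` the same check runs again, with only `num_base_bdevs_discovered` dropping from 2 to 1 while the state stays `configuring`.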
00:15:58.957 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.215 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.215 "name": "Existed_Raid", 00:15:59.215 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:15:59.215 "strip_size_kb": 64, 00:15:59.215 "state": "configuring", 00:15:59.215 "raid_level": "concat", 00:15:59.215 "superblock": true, 00:15:59.215 "num_base_bdevs": 3, 00:15:59.215 "num_base_bdevs_discovered": 1, 00:15:59.215 "num_base_bdevs_operational": 3, 00:15:59.215 "base_bdevs_list": [ 00:15:59.215 { 00:15:59.215 "name": "BaseBdev1", 00:15:59.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.215 "is_configured": false, 00:15:59.215 "data_offset": 0, 00:15:59.215 "data_size": 0 00:15:59.215 }, 00:15:59.215 { 00:15:59.215 "name": null, 00:15:59.215 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:15:59.215 "is_configured": false, 00:15:59.215 "data_offset": 2048, 00:15:59.215 "data_size": 63488 00:15:59.215 }, 00:15:59.215 { 00:15:59.215 "name": "BaseBdev3", 00:15:59.215 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:15:59.215 "is_configured": true, 00:15:59.215 "data_offset": 2048, 00:15:59.215 "data_size": 63488 00:15:59.215 } 00:15:59.215 ] 00:15:59.215 }' 00:15:59.215 19:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.215 19:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:59.782 19:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.782 19:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:00.042 19:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:16:00.042 19:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:00.302 [2024-07-24 19:51:51.732696] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:00.302 BaseBdev1 00:16:00.302 19:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:00.302 19:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:00.302 19:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:00.302 19:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:00.302 19:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:00.302 19:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:00.302 19:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:00.561 19:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:00.821 [ 00:16:00.821 { 00:16:00.821 "name": "BaseBdev1", 00:16:00.821 "aliases": [ 00:16:00.821 "b6a46031-03d6-41af-9df8-2db04adf028f" 00:16:00.821 ], 00:16:00.821 "product_name": "Malloc disk", 00:16:00.821 "block_size": 512, 00:16:00.821 "num_blocks": 65536, 00:16:00.821 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:00.821 "assigned_rate_limits": { 00:16:00.821 "rw_ios_per_sec": 0, 00:16:00.821 "rw_mbytes_per_sec": 0, 00:16:00.821 "r_mbytes_per_sec": 0, 00:16:00.821 
"w_mbytes_per_sec": 0 00:16:00.821 }, 00:16:00.821 "claimed": true, 00:16:00.821 "claim_type": "exclusive_write", 00:16:00.821 "zoned": false, 00:16:00.821 "supported_io_types": { 00:16:00.821 "read": true, 00:16:00.821 "write": true, 00:16:00.821 "unmap": true, 00:16:00.821 "flush": true, 00:16:00.821 "reset": true, 00:16:00.821 "nvme_admin": false, 00:16:00.821 "nvme_io": false, 00:16:00.821 "nvme_io_md": false, 00:16:00.821 "write_zeroes": true, 00:16:00.821 "zcopy": true, 00:16:00.821 "get_zone_info": false, 00:16:00.821 "zone_management": false, 00:16:00.821 "zone_append": false, 00:16:00.821 "compare": false, 00:16:00.821 "compare_and_write": false, 00:16:00.821 "abort": true, 00:16:00.821 "seek_hole": false, 00:16:00.821 "seek_data": false, 00:16:00.821 "copy": true, 00:16:00.821 "nvme_iov_md": false 00:16:00.821 }, 00:16:00.821 "memory_domains": [ 00:16:00.821 { 00:16:00.821 "dma_device_id": "system", 00:16:00.821 "dma_device_type": 1 00:16:00.821 }, 00:16:00.821 { 00:16:00.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.821 "dma_device_type": 2 00:16:00.821 } 00:16:00.821 ], 00:16:00.821 "driver_specific": {} 00:16:00.821 } 00:16:00.821 ] 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.821 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.081 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.081 "name": "Existed_Raid", 00:16:01.081 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:01.081 "strip_size_kb": 64, 00:16:01.081 "state": "configuring", 00:16:01.081 "raid_level": "concat", 00:16:01.081 "superblock": true, 00:16:01.081 "num_base_bdevs": 3, 00:16:01.081 "num_base_bdevs_discovered": 2, 00:16:01.081 "num_base_bdevs_operational": 3, 00:16:01.081 "base_bdevs_list": [ 00:16:01.081 { 00:16:01.081 "name": "BaseBdev1", 00:16:01.081 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:01.081 "is_configured": true, 00:16:01.081 "data_offset": 2048, 00:16:01.081 "data_size": 63488 00:16:01.081 }, 00:16:01.081 { 00:16:01.081 "name": null, 00:16:01.081 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:01.081 "is_configured": false, 00:16:01.081 "data_offset": 2048, 00:16:01.081 "data_size": 63488 00:16:01.081 }, 00:16:01.081 { 00:16:01.081 "name": "BaseBdev3", 00:16:01.081 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:01.081 "is_configured": true, 00:16:01.081 "data_offset": 2048, 00:16:01.081 "data_size": 63488 00:16:01.081 } 
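The `jq '.[0].base_bdevs_list[N].is_configured'` probes in the trace (bdev_raid.sh@310/@315/@319) rely on a detail the dumps make visible: a removed base bdev keeps its slot in `base_bdevs_list` (its `name` becomes `null`) and only `is_configured` flips. A sketch of that probe with `python3` standing in for `jq`, over a sample trimmed from the dumps above (`slot_configured` is a hypothetical helper name):

```shell
#!/bin/sh
# Sample mirroring the dumps above: BaseBdev2's slot persists with name null.
raid_bdevs='[{"name": "Existed_Raid", "base_bdevs_list": [
  {"name": "BaseBdev1", "is_configured": true},
  {"name": null,        "is_configured": false},
  {"name": "BaseBdev3", "is_configured": true}]}]'

# slot_configured <index>: print true/false for that slot, like the jq probe.
slot_configured() {
    printf '%s' "$raid_bdevs" | python3 -c "
import json, sys
print(str(json.load(sys.stdin)[0]['base_bdevs_list'][$1]['is_configured']).lower())"
}

[ "$(slot_configured 0)" = true ]  && echo "slot 0 configured"
[ "$(slot_configured 1)" = false ] && echo "slot 1 removed"
```

This is why the test can later re-add a bdev into the same slot (`bdev_raid_add_base_bdev`, or recreating the malloc bdev with the original UUID as done for NewBaseBdev below) and the raid reassembles once all slots read `true`.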
00:16:01.081 ] 00:16:01.081 }' 00:16:01.081 19:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.081 19:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:01.649 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.649 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:01.908 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:01.908 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:02.168 [2024-07-24 19:51:53.597661] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.168 
19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.168 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.427 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.427 "name": "Existed_Raid", 00:16:02.427 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:02.427 "strip_size_kb": 64, 00:16:02.427 "state": "configuring", 00:16:02.427 "raid_level": "concat", 00:16:02.427 "superblock": true, 00:16:02.427 "num_base_bdevs": 3, 00:16:02.427 "num_base_bdevs_discovered": 1, 00:16:02.427 "num_base_bdevs_operational": 3, 00:16:02.427 "base_bdevs_list": [ 00:16:02.427 { 00:16:02.427 "name": "BaseBdev1", 00:16:02.427 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:02.427 "is_configured": true, 00:16:02.427 "data_offset": 2048, 00:16:02.427 "data_size": 63488 00:16:02.427 }, 00:16:02.427 { 00:16:02.427 "name": null, 00:16:02.427 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:02.427 "is_configured": false, 00:16:02.427 "data_offset": 2048, 00:16:02.427 "data_size": 63488 00:16:02.427 }, 00:16:02.427 { 00:16:02.427 "name": null, 00:16:02.427 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:02.427 "is_configured": false, 00:16:02.427 "data_offset": 2048, 00:16:02.427 "data_size": 63488 00:16:02.427 } 00:16:02.427 ] 00:16:02.427 }' 00:16:02.427 19:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.427 19:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.996 19:51:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.996 19:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:03.256 19:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:03.256 19:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:03.824 [2024-07-24 19:51:55.201958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.824 19:51:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.824 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.084 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.084 "name": "Existed_Raid", 00:16:04.084 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:04.084 "strip_size_kb": 64, 00:16:04.084 "state": "configuring", 00:16:04.084 "raid_level": "concat", 00:16:04.084 "superblock": true, 00:16:04.084 "num_base_bdevs": 3, 00:16:04.084 "num_base_bdevs_discovered": 2, 00:16:04.084 "num_base_bdevs_operational": 3, 00:16:04.084 "base_bdevs_list": [ 00:16:04.084 { 00:16:04.084 "name": "BaseBdev1", 00:16:04.084 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:04.084 "is_configured": true, 00:16:04.084 "data_offset": 2048, 00:16:04.084 "data_size": 63488 00:16:04.084 }, 00:16:04.084 { 00:16:04.084 "name": null, 00:16:04.084 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:04.084 "is_configured": false, 00:16:04.084 "data_offset": 2048, 00:16:04.084 "data_size": 63488 00:16:04.084 }, 00:16:04.084 { 00:16:04.084 "name": "BaseBdev3", 00:16:04.084 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:04.084 "is_configured": true, 00:16:04.084 "data_offset": 2048, 00:16:04.084 "data_size": 63488 00:16:04.084 } 00:16:04.084 ] 00:16:04.084 }' 00:16:04.084 19:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.084 19:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:04.652 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:04.652 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.220 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:05.220 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:05.479 [2024-07-24 19:51:56.906522] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.479 19:51:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.047 19:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.047 "name": "Existed_Raid", 00:16:06.047 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:06.047 "strip_size_kb": 64, 00:16:06.047 "state": "configuring", 00:16:06.047 "raid_level": "concat", 00:16:06.047 "superblock": true, 00:16:06.047 "num_base_bdevs": 3, 00:16:06.047 "num_base_bdevs_discovered": 1, 00:16:06.047 "num_base_bdevs_operational": 3, 00:16:06.047 "base_bdevs_list": [ 00:16:06.047 { 00:16:06.047 "name": null, 00:16:06.047 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:06.047 "is_configured": false, 00:16:06.047 "data_offset": 2048, 00:16:06.047 "data_size": 63488 00:16:06.047 }, 00:16:06.047 { 00:16:06.047 "name": null, 00:16:06.047 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:06.047 "is_configured": false, 00:16:06.047 "data_offset": 2048, 00:16:06.047 "data_size": 63488 00:16:06.047 }, 00:16:06.047 { 00:16:06.047 "name": "BaseBdev3", 00:16:06.047 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:06.047 "is_configured": true, 00:16:06.047 "data_offset": 2048, 00:16:06.047 "data_size": 63488 00:16:06.047 } 00:16:06.047 ] 00:16:06.047 }' 00:16:06.047 19:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.047 19:51:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:06.615 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.615 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:06.874 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:06.874 19:51:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:07.442 [2024-07-24 19:51:58.875815] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.442 19:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.702 19:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.702 "name": 
"Existed_Raid", 00:16:07.702 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:07.702 "strip_size_kb": 64, 00:16:07.702 "state": "configuring", 00:16:07.702 "raid_level": "concat", 00:16:07.702 "superblock": true, 00:16:07.702 "num_base_bdevs": 3, 00:16:07.702 "num_base_bdevs_discovered": 2, 00:16:07.702 "num_base_bdevs_operational": 3, 00:16:07.702 "base_bdevs_list": [ 00:16:07.702 { 00:16:07.702 "name": null, 00:16:07.702 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:07.702 "is_configured": false, 00:16:07.702 "data_offset": 2048, 00:16:07.702 "data_size": 63488 00:16:07.702 }, 00:16:07.702 { 00:16:07.702 "name": "BaseBdev2", 00:16:07.702 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:07.702 "is_configured": true, 00:16:07.702 "data_offset": 2048, 00:16:07.702 "data_size": 63488 00:16:07.702 }, 00:16:07.702 { 00:16:07.702 "name": "BaseBdev3", 00:16:07.702 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:07.702 "is_configured": true, 00:16:07.702 "data_offset": 2048, 00:16:07.702 "data_size": 63488 00:16:07.702 } 00:16:07.702 ] 00:16:07.702 }' 00:16:07.702 19:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.702 19:51:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:08.270 19:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.270 19:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:08.531 19:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:08.531 19:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.531 19:52:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:08.790 19:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b6a46031-03d6-41af-9df8-2db04adf028f 00:16:09.050 [2024-07-24 19:52:00.533143] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:09.050 [2024-07-24 19:52:00.533317] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21c85c0 00:16:09.050 [2024-07-24 19:52:00.533331] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:09.050 [2024-07-24 19:52:00.533529] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2374fd0 00:16:09.050 [2024-07-24 19:52:00.533654] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21c85c0 00:16:09.050 [2024-07-24 19:52:00.533665] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21c85c0 00:16:09.050 [2024-07-24 19:52:00.533767] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:09.050 NewBaseBdev 00:16:09.050 19:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:09.050 19:52:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:09.050 19:52:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:09.050 19:52:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:09.050 19:52:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:09.050 19:52:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:09.050 19:52:00 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:09.309 19:52:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:09.600 [ 00:16:09.600 { 00:16:09.600 "name": "NewBaseBdev", 00:16:09.600 "aliases": [ 00:16:09.600 "b6a46031-03d6-41af-9df8-2db04adf028f" 00:16:09.600 ], 00:16:09.601 "product_name": "Malloc disk", 00:16:09.601 "block_size": 512, 00:16:09.601 "num_blocks": 65536, 00:16:09.601 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:09.601 "assigned_rate_limits": { 00:16:09.601 "rw_ios_per_sec": 0, 00:16:09.601 "rw_mbytes_per_sec": 0, 00:16:09.601 "r_mbytes_per_sec": 0, 00:16:09.601 "w_mbytes_per_sec": 0 00:16:09.601 }, 00:16:09.601 "claimed": true, 00:16:09.601 "claim_type": "exclusive_write", 00:16:09.601 "zoned": false, 00:16:09.601 "supported_io_types": { 00:16:09.601 "read": true, 00:16:09.601 "write": true, 00:16:09.601 "unmap": true, 00:16:09.601 "flush": true, 00:16:09.601 "reset": true, 00:16:09.601 "nvme_admin": false, 00:16:09.601 "nvme_io": false, 00:16:09.601 "nvme_io_md": false, 00:16:09.601 "write_zeroes": true, 00:16:09.601 "zcopy": true, 00:16:09.601 "get_zone_info": false, 00:16:09.601 "zone_management": false, 00:16:09.601 "zone_append": false, 00:16:09.601 "compare": false, 00:16:09.601 "compare_and_write": false, 00:16:09.601 "abort": true, 00:16:09.601 "seek_hole": false, 00:16:09.601 "seek_data": false, 00:16:09.601 "copy": true, 00:16:09.601 "nvme_iov_md": false 00:16:09.601 }, 00:16:09.601 "memory_domains": [ 00:16:09.601 { 00:16:09.601 "dma_device_id": "system", 00:16:09.601 "dma_device_type": 1 00:16:09.601 }, 00:16:09.601 { 00:16:09.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.601 "dma_device_type": 2 00:16:09.601 } 
00:16:09.601 ], 00:16:09.601 "driver_specific": {} 00:16:09.601 } 00:16:09.601 ] 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.601 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.860 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.860 "name": "Existed_Raid", 00:16:09.860 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:09.860 "strip_size_kb": 64, 00:16:09.860 "state": "online", 00:16:09.860 
"raid_level": "concat", 00:16:09.860 "superblock": true, 00:16:09.860 "num_base_bdevs": 3, 00:16:09.860 "num_base_bdevs_discovered": 3, 00:16:09.860 "num_base_bdevs_operational": 3, 00:16:09.860 "base_bdevs_list": [ 00:16:09.860 { 00:16:09.860 "name": "NewBaseBdev", 00:16:09.860 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:09.860 "is_configured": true, 00:16:09.860 "data_offset": 2048, 00:16:09.860 "data_size": 63488 00:16:09.860 }, 00:16:09.860 { 00:16:09.860 "name": "BaseBdev2", 00:16:09.860 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:09.860 "is_configured": true, 00:16:09.860 "data_offset": 2048, 00:16:09.860 "data_size": 63488 00:16:09.860 }, 00:16:09.860 { 00:16:09.860 "name": "BaseBdev3", 00:16:09.860 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:09.860 "is_configured": true, 00:16:09.860 "data_offset": 2048, 00:16:09.860 "data_size": 63488 00:16:09.860 } 00:16:09.860 ] 00:16:09.860 }' 00:16:09.860 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.860 19:52:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:10.428 19:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:10.687 [2024-07-24 19:52:02.129658] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:10.687 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:10.687 "name": "Existed_Raid", 00:16:10.687 "aliases": [ 00:16:10.687 "36f3b73d-505d-4e4b-8664-445642f97c5f" 00:16:10.687 ], 00:16:10.687 "product_name": "Raid Volume", 00:16:10.687 "block_size": 512, 00:16:10.687 "num_blocks": 190464, 00:16:10.687 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:10.687 "assigned_rate_limits": { 00:16:10.687 "rw_ios_per_sec": 0, 00:16:10.687 "rw_mbytes_per_sec": 0, 00:16:10.687 "r_mbytes_per_sec": 0, 00:16:10.687 "w_mbytes_per_sec": 0 00:16:10.687 }, 00:16:10.687 "claimed": false, 00:16:10.687 "zoned": false, 00:16:10.687 "supported_io_types": { 00:16:10.687 "read": true, 00:16:10.687 "write": true, 00:16:10.687 "unmap": true, 00:16:10.687 "flush": true, 00:16:10.687 "reset": true, 00:16:10.687 "nvme_admin": false, 00:16:10.687 "nvme_io": false, 00:16:10.687 "nvme_io_md": false, 00:16:10.687 "write_zeroes": true, 00:16:10.687 "zcopy": false, 00:16:10.687 "get_zone_info": false, 00:16:10.687 "zone_management": false, 00:16:10.687 "zone_append": false, 00:16:10.687 "compare": false, 00:16:10.687 "compare_and_write": false, 00:16:10.687 "abort": false, 00:16:10.687 "seek_hole": false, 00:16:10.687 "seek_data": false, 00:16:10.687 "copy": false, 00:16:10.687 "nvme_iov_md": false 00:16:10.687 }, 00:16:10.687 "memory_domains": [ 00:16:10.687 { 00:16:10.687 "dma_device_id": "system", 00:16:10.687 "dma_device_type": 1 00:16:10.687 }, 00:16:10.687 { 00:16:10.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.687 "dma_device_type": 2 00:16:10.687 }, 00:16:10.687 { 00:16:10.687 "dma_device_id": "system", 00:16:10.687 "dma_device_type": 1 00:16:10.687 
}, 00:16:10.687 { 00:16:10.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.687 "dma_device_type": 2 00:16:10.687 }, 00:16:10.687 { 00:16:10.687 "dma_device_id": "system", 00:16:10.687 "dma_device_type": 1 00:16:10.687 }, 00:16:10.687 { 00:16:10.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.687 "dma_device_type": 2 00:16:10.687 } 00:16:10.687 ], 00:16:10.687 "driver_specific": { 00:16:10.687 "raid": { 00:16:10.687 "uuid": "36f3b73d-505d-4e4b-8664-445642f97c5f", 00:16:10.687 "strip_size_kb": 64, 00:16:10.687 "state": "online", 00:16:10.687 "raid_level": "concat", 00:16:10.687 "superblock": true, 00:16:10.687 "num_base_bdevs": 3, 00:16:10.687 "num_base_bdevs_discovered": 3, 00:16:10.687 "num_base_bdevs_operational": 3, 00:16:10.688 "base_bdevs_list": [ 00:16:10.688 { 00:16:10.688 "name": "NewBaseBdev", 00:16:10.688 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:10.688 "is_configured": true, 00:16:10.688 "data_offset": 2048, 00:16:10.688 "data_size": 63488 00:16:10.688 }, 00:16:10.688 { 00:16:10.688 "name": "BaseBdev2", 00:16:10.688 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:10.688 "is_configured": true, 00:16:10.688 "data_offset": 2048, 00:16:10.688 "data_size": 63488 00:16:10.688 }, 00:16:10.688 { 00:16:10.688 "name": "BaseBdev3", 00:16:10.688 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:10.688 "is_configured": true, 00:16:10.688 "data_offset": 2048, 00:16:10.688 "data_size": 63488 00:16:10.688 } 00:16:10.688 ] 00:16:10.688 } 00:16:10.688 } 00:16:10.688 }' 00:16:10.688 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:10.688 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:10.688 BaseBdev2 00:16:10.688 BaseBdev3' 00:16:10.688 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.688 
19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:10.688 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.948 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:10.948 "name": "NewBaseBdev", 00:16:10.948 "aliases": [ 00:16:10.948 "b6a46031-03d6-41af-9df8-2db04adf028f" 00:16:10.948 ], 00:16:10.948 "product_name": "Malloc disk", 00:16:10.948 "block_size": 512, 00:16:10.948 "num_blocks": 65536, 00:16:10.948 "uuid": "b6a46031-03d6-41af-9df8-2db04adf028f", 00:16:10.948 "assigned_rate_limits": { 00:16:10.948 "rw_ios_per_sec": 0, 00:16:10.948 "rw_mbytes_per_sec": 0, 00:16:10.948 "r_mbytes_per_sec": 0, 00:16:10.948 "w_mbytes_per_sec": 0 00:16:10.948 }, 00:16:10.948 "claimed": true, 00:16:10.948 "claim_type": "exclusive_write", 00:16:10.948 "zoned": false, 00:16:10.948 "supported_io_types": { 00:16:10.948 "read": true, 00:16:10.948 "write": true, 00:16:10.948 "unmap": true, 00:16:10.948 "flush": true, 00:16:10.948 "reset": true, 00:16:10.948 "nvme_admin": false, 00:16:10.948 "nvme_io": false, 00:16:10.948 "nvme_io_md": false, 00:16:10.948 "write_zeroes": true, 00:16:10.948 "zcopy": true, 00:16:10.948 "get_zone_info": false, 00:16:10.948 "zone_management": false, 00:16:10.948 "zone_append": false, 00:16:10.948 "compare": false, 00:16:10.948 "compare_and_write": false, 00:16:10.948 "abort": true, 00:16:10.948 "seek_hole": false, 00:16:10.948 "seek_data": false, 00:16:10.948 "copy": true, 00:16:10.948 "nvme_iov_md": false 00:16:10.948 }, 00:16:10.948 "memory_domains": [ 00:16:10.948 { 00:16:10.948 "dma_device_id": "system", 00:16:10.948 "dma_device_type": 1 00:16:10.948 }, 00:16:10.948 { 00:16:10.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.948 "dma_device_type": 2 00:16:10.948 } 00:16:10.948 ], 00:16:10.948 
"driver_specific": {} 00:16:10.948 }' 00:16:10.948 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.948 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:11.207 19:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.466 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.466 "name": "BaseBdev2", 00:16:11.466 "aliases": [ 00:16:11.466 "499eb674-44d0-4104-9599-13ce601f0fd4" 00:16:11.466 ], 00:16:11.466 "product_name": 
"Malloc disk", 00:16:11.466 "block_size": 512, 00:16:11.466 "num_blocks": 65536, 00:16:11.466 "uuid": "499eb674-44d0-4104-9599-13ce601f0fd4", 00:16:11.466 "assigned_rate_limits": { 00:16:11.466 "rw_ios_per_sec": 0, 00:16:11.466 "rw_mbytes_per_sec": 0, 00:16:11.466 "r_mbytes_per_sec": 0, 00:16:11.466 "w_mbytes_per_sec": 0 00:16:11.466 }, 00:16:11.466 "claimed": true, 00:16:11.466 "claim_type": "exclusive_write", 00:16:11.466 "zoned": false, 00:16:11.466 "supported_io_types": { 00:16:11.466 "read": true, 00:16:11.466 "write": true, 00:16:11.466 "unmap": true, 00:16:11.466 "flush": true, 00:16:11.466 "reset": true, 00:16:11.466 "nvme_admin": false, 00:16:11.466 "nvme_io": false, 00:16:11.466 "nvme_io_md": false, 00:16:11.466 "write_zeroes": true, 00:16:11.466 "zcopy": true, 00:16:11.466 "get_zone_info": false, 00:16:11.466 "zone_management": false, 00:16:11.466 "zone_append": false, 00:16:11.466 "compare": false, 00:16:11.466 "compare_and_write": false, 00:16:11.466 "abort": true, 00:16:11.466 "seek_hole": false, 00:16:11.466 "seek_data": false, 00:16:11.466 "copy": true, 00:16:11.466 "nvme_iov_md": false 00:16:11.466 }, 00:16:11.466 "memory_domains": [ 00:16:11.466 { 00:16:11.466 "dma_device_id": "system", 00:16:11.466 "dma_device_type": 1 00:16:11.466 }, 00:16:11.466 { 00:16:11.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.466 "dma_device_type": 2 00:16:11.466 } 00:16:11.466 ], 00:16:11.466 "driver_specific": {} 00:16:11.466 }' 00:16:11.466 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.725 
19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.725 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.982 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.983 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.983 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.983 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:11.983 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:12.241 "name": "BaseBdev3", 00:16:12.241 "aliases": [ 00:16:12.241 "20831677-77bd-4b1c-85a9-d274cb879247" 00:16:12.241 ], 00:16:12.241 "product_name": "Malloc disk", 00:16:12.241 "block_size": 512, 00:16:12.241 "num_blocks": 65536, 00:16:12.241 "uuid": "20831677-77bd-4b1c-85a9-d274cb879247", 00:16:12.241 "assigned_rate_limits": { 00:16:12.241 "rw_ios_per_sec": 0, 00:16:12.241 "rw_mbytes_per_sec": 0, 00:16:12.241 "r_mbytes_per_sec": 0, 00:16:12.241 "w_mbytes_per_sec": 0 00:16:12.241 }, 00:16:12.241 "claimed": true, 00:16:12.241 "claim_type": "exclusive_write", 00:16:12.241 "zoned": false, 00:16:12.241 "supported_io_types": { 00:16:12.241 "read": true, 00:16:12.241 "write": true, 00:16:12.241 "unmap": true, 
00:16:12.241 "flush": true, 00:16:12.241 "reset": true, 00:16:12.241 "nvme_admin": false, 00:16:12.241 "nvme_io": false, 00:16:12.241 "nvme_io_md": false, 00:16:12.241 "write_zeroes": true, 00:16:12.241 "zcopy": true, 00:16:12.241 "get_zone_info": false, 00:16:12.241 "zone_management": false, 00:16:12.241 "zone_append": false, 00:16:12.241 "compare": false, 00:16:12.241 "compare_and_write": false, 00:16:12.241 "abort": true, 00:16:12.241 "seek_hole": false, 00:16:12.241 "seek_data": false, 00:16:12.241 "copy": true, 00:16:12.241 "nvme_iov_md": false 00:16:12.241 }, 00:16:12.241 "memory_domains": [ 00:16:12.241 { 00:16:12.241 "dma_device_id": "system", 00:16:12.241 "dma_device_type": 1 00:16:12.241 }, 00:16:12.241 { 00:16:12.241 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.241 "dma_device_type": 2 00:16:12.241 } 00:16:12.241 ], 00:16:12.241 "driver_specific": {} 00:16:12.241 }' 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.241 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.499 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.499 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.499 19:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.499 19:52:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.499 19:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.499 19:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:12.758 [2024-07-24 19:52:04.226933] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:12.758 [2024-07-24 19:52:04.226963] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:12.758 [2024-07-24 19:52:04.227023] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:12.758 [2024-07-24 19:52:04.227084] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:12.758 [2024-07-24 19:52:04.227097] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21c85c0 name Existed_Raid, state offline 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1412308 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1412308 ']' 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1412308 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1412308 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' 
reactor_0 = sudo ']' 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1412308' 00:16:12.758 killing process with pid 1412308 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1412308 00:16:12.758 [2024-07-24 19:52:04.293935] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:12.758 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1412308 00:16:13.017 [2024-07-24 19:52:04.359130] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:13.277 19:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:13.277 00:16:13.277 real 0m30.991s 00:16:13.277 user 0m56.817s 00:16:13.277 sys 0m5.346s 00:16:13.277 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:13.277 19:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:13.277 ************************************ 00:16:13.277 END TEST raid_state_function_test_sb 00:16:13.277 ************************************ 00:16:13.277 19:52:04 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:16:13.277 19:52:04 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:16:13.277 19:52:04 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:13.277 19:52:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:13.277 ************************************ 00:16:13.277 START TEST raid_superblock_test 00:16:13.277 ************************************ 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 3 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1416940 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1416940 /var/tmp/spdk-raid.sock 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:13.277 19:52:04 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1416940 ']' 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:13.277 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:13.277 19:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.536 [2024-07-24 19:52:04.912562] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:16:13.536 [2024-07-24 19:52:04.912629] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1416940 ] 00:16:13.536 [2024-07-24 19:52:05.043908] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:13.794 [2024-07-24 19:52:05.156042] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:13.794 [2024-07-24 19:52:05.220640] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:13.794 [2024-07-24 19:52:05.220671] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:16:13.794 
19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:13.794 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:14.053 malloc1 00:16:14.053 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:14.311 [2024-07-24 19:52:05.850803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:14.311 [2024-07-24 19:52:05.850851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:14.311 [2024-07-24 19:52:05.850873] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20eb590 00:16:14.311 [2024-07-24 19:52:05.850886] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:14.311 [2024-07-24 19:52:05.852619] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:14.311 [2024-07-24 19:52:05.852648] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 
00:16:14.311 pt1 00:16:14.311 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:14.312 19:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:14.570 malloc2 00:16:14.570 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:14.829 [2024-07-24 19:52:06.338044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:14.829 [2024-07-24 19:52:06.338090] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:14.829 [2024-07-24 19:52:06.338108] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2291690 00:16:14.829 [2024-07-24 19:52:06.338120] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:14.829 [2024-07-24 19:52:06.339657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:16:14.829 [2024-07-24 19:52:06.339685] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:14.829 pt2 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:14.829 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:15.088 malloc3 00:16:15.088 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:15.347 [2024-07-24 19:52:06.837207] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:15.347 [2024-07-24 19:52:06.837254] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.347 [2024-07-24 19:52:06.837273] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2292fc0 00:16:15.347 [2024-07-24 19:52:06.837285] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:16:15.348 [2024-07-24 19:52:06.838866] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.348 [2024-07-24 19:52:06.838895] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:15.348 pt3 00:16:15.348 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:15.348 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:15.348 19:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:15.606 [2024-07-24 19:52:07.081892] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:15.606 [2024-07-24 19:52:07.083215] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:15.606 [2024-07-24 19:52:07.083270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:15.606 [2024-07-24 19:52:07.083438] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2293d10 00:16:15.606 [2024-07-24 19:52:07.083449] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:15.606 [2024-07-24 19:52:07.083648] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2102480 00:16:15.606 [2024-07-24 19:52:07.083793] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2293d10 00:16:15.606 [2024-07-24 19:52:07.083804] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2293d10 00:16:15.606 [2024-07-24 19:52:07.083904] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:15.606 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:15.607 19:52:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.607 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:15.868 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.868 "name": "raid_bdev1", 00:16:15.868 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:15.868 "strip_size_kb": 64, 00:16:15.868 "state": "online", 00:16:15.868 "raid_level": "concat", 00:16:15.868 "superblock": true, 00:16:15.868 "num_base_bdevs": 3, 00:16:15.868 "num_base_bdevs_discovered": 3, 00:16:15.868 "num_base_bdevs_operational": 3, 00:16:15.868 "base_bdevs_list": [ 00:16:15.868 { 00:16:15.868 "name": "pt1", 00:16:15.868 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:15.868 "is_configured": true, 00:16:15.868 "data_offset": 2048, 00:16:15.868 "data_size": 63488 00:16:15.868 }, 00:16:15.868 
{ 00:16:15.868 "name": "pt2", 00:16:15.868 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:15.868 "is_configured": true, 00:16:15.868 "data_offset": 2048, 00:16:15.868 "data_size": 63488 00:16:15.868 }, 00:16:15.868 { 00:16:15.868 "name": "pt3", 00:16:15.868 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:15.868 "is_configured": true, 00:16:15.868 "data_offset": 2048, 00:16:15.868 "data_size": 63488 00:16:15.868 } 00:16:15.868 ] 00:16:15.868 }' 00:16:15.868 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.868 19:52:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:16.437 19:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:16.696 [2024-07-24 19:52:08.164995] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:16.696 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:16.696 "name": "raid_bdev1", 00:16:16.696 "aliases": [ 00:16:16.696 "15ee56e0-3b33-41d6-8948-dc776038ddf0" 00:16:16.696 ], 00:16:16.696 "product_name": "Raid Volume", 
00:16:16.696 "block_size": 512, 00:16:16.696 "num_blocks": 190464, 00:16:16.696 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:16.696 "assigned_rate_limits": { 00:16:16.696 "rw_ios_per_sec": 0, 00:16:16.696 "rw_mbytes_per_sec": 0, 00:16:16.696 "r_mbytes_per_sec": 0, 00:16:16.696 "w_mbytes_per_sec": 0 00:16:16.696 }, 00:16:16.696 "claimed": false, 00:16:16.696 "zoned": false, 00:16:16.696 "supported_io_types": { 00:16:16.696 "read": true, 00:16:16.696 "write": true, 00:16:16.696 "unmap": true, 00:16:16.696 "flush": true, 00:16:16.696 "reset": true, 00:16:16.696 "nvme_admin": false, 00:16:16.696 "nvme_io": false, 00:16:16.696 "nvme_io_md": false, 00:16:16.696 "write_zeroes": true, 00:16:16.696 "zcopy": false, 00:16:16.696 "get_zone_info": false, 00:16:16.696 "zone_management": false, 00:16:16.696 "zone_append": false, 00:16:16.696 "compare": false, 00:16:16.696 "compare_and_write": false, 00:16:16.696 "abort": false, 00:16:16.696 "seek_hole": false, 00:16:16.696 "seek_data": false, 00:16:16.696 "copy": false, 00:16:16.696 "nvme_iov_md": false 00:16:16.696 }, 00:16:16.696 "memory_domains": [ 00:16:16.696 { 00:16:16.696 "dma_device_id": "system", 00:16:16.696 "dma_device_type": 1 00:16:16.696 }, 00:16:16.696 { 00:16:16.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.696 "dma_device_type": 2 00:16:16.696 }, 00:16:16.696 { 00:16:16.696 "dma_device_id": "system", 00:16:16.696 "dma_device_type": 1 00:16:16.696 }, 00:16:16.696 { 00:16:16.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.696 "dma_device_type": 2 00:16:16.696 }, 00:16:16.696 { 00:16:16.696 "dma_device_id": "system", 00:16:16.696 "dma_device_type": 1 00:16:16.696 }, 00:16:16.696 { 00:16:16.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.696 "dma_device_type": 2 00:16:16.696 } 00:16:16.696 ], 00:16:16.696 "driver_specific": { 00:16:16.696 "raid": { 00:16:16.696 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:16.696 "strip_size_kb": 64, 00:16:16.696 "state": "online", 00:16:16.696 
"raid_level": "concat", 00:16:16.696 "superblock": true, 00:16:16.696 "num_base_bdevs": 3, 00:16:16.696 "num_base_bdevs_discovered": 3, 00:16:16.696 "num_base_bdevs_operational": 3, 00:16:16.697 "base_bdevs_list": [ 00:16:16.697 { 00:16:16.697 "name": "pt1", 00:16:16.697 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:16.697 "is_configured": true, 00:16:16.697 "data_offset": 2048, 00:16:16.697 "data_size": 63488 00:16:16.697 }, 00:16:16.697 { 00:16:16.697 "name": "pt2", 00:16:16.697 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:16.697 "is_configured": true, 00:16:16.697 "data_offset": 2048, 00:16:16.697 "data_size": 63488 00:16:16.697 }, 00:16:16.697 { 00:16:16.697 "name": "pt3", 00:16:16.697 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:16.697 "is_configured": true, 00:16:16.697 "data_offset": 2048, 00:16:16.697 "data_size": 63488 00:16:16.697 } 00:16:16.697 ] 00:16:16.697 } 00:16:16.697 } 00:16:16.697 }' 00:16:16.697 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:16.697 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:16.697 pt2 00:16:16.697 pt3' 00:16:16.697 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:16.697 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:16.697 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:16.956 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:16.956 "name": "pt1", 00:16:16.956 "aliases": [ 00:16:16.956 "00000000-0000-0000-0000-000000000001" 00:16:16.956 ], 00:16:16.956 "product_name": "passthru", 00:16:16.956 "block_size": 512, 00:16:16.956 "num_blocks": 65536, 00:16:16.956 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:16:16.956 "assigned_rate_limits": { 00:16:16.956 "rw_ios_per_sec": 0, 00:16:16.956 "rw_mbytes_per_sec": 0, 00:16:16.956 "r_mbytes_per_sec": 0, 00:16:16.956 "w_mbytes_per_sec": 0 00:16:16.956 }, 00:16:16.956 "claimed": true, 00:16:16.956 "claim_type": "exclusive_write", 00:16:16.956 "zoned": false, 00:16:16.956 "supported_io_types": { 00:16:16.956 "read": true, 00:16:16.956 "write": true, 00:16:16.956 "unmap": true, 00:16:16.956 "flush": true, 00:16:16.956 "reset": true, 00:16:16.956 "nvme_admin": false, 00:16:16.956 "nvme_io": false, 00:16:16.956 "nvme_io_md": false, 00:16:16.956 "write_zeroes": true, 00:16:16.956 "zcopy": true, 00:16:16.956 "get_zone_info": false, 00:16:16.956 "zone_management": false, 00:16:16.956 "zone_append": false, 00:16:16.956 "compare": false, 00:16:16.956 "compare_and_write": false, 00:16:16.956 "abort": true, 00:16:16.956 "seek_hole": false, 00:16:16.956 "seek_data": false, 00:16:16.956 "copy": true, 00:16:16.956 "nvme_iov_md": false 00:16:16.956 }, 00:16:16.956 "memory_domains": [ 00:16:16.956 { 00:16:16.956 "dma_device_id": "system", 00:16:16.956 "dma_device_type": 1 00:16:16.956 }, 00:16:16.956 { 00:16:16.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.956 "dma_device_type": 2 00:16:16.956 } 00:16:16.956 ], 00:16:16.956 "driver_specific": { 00:16:16.956 "passthru": { 00:16:16.956 "name": "pt1", 00:16:16.956 "base_bdev_name": "malloc1" 00:16:16.956 } 00:16:16.956 } 00:16:16.956 }' 00:16:16.956 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:16.956 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.216 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.216 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.216 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.216 19:52:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.216 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.216 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.216 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.216 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.475 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.475 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.475 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.475 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:17.475 19:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.734 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.735 "name": "pt2", 00:16:17.735 "aliases": [ 00:16:17.735 "00000000-0000-0000-0000-000000000002" 00:16:17.735 ], 00:16:17.735 "product_name": "passthru", 00:16:17.735 "block_size": 512, 00:16:17.735 "num_blocks": 65536, 00:16:17.735 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:17.735 "assigned_rate_limits": { 00:16:17.735 "rw_ios_per_sec": 0, 00:16:17.735 "rw_mbytes_per_sec": 0, 00:16:17.735 "r_mbytes_per_sec": 0, 00:16:17.735 "w_mbytes_per_sec": 0 00:16:17.735 }, 00:16:17.735 "claimed": true, 00:16:17.735 "claim_type": "exclusive_write", 00:16:17.735 "zoned": false, 00:16:17.735 "supported_io_types": { 00:16:17.735 "read": true, 00:16:17.735 "write": true, 00:16:17.735 "unmap": true, 00:16:17.735 "flush": true, 00:16:17.735 "reset": true, 00:16:17.735 "nvme_admin": false, 00:16:17.735 
"nvme_io": false, 00:16:17.735 "nvme_io_md": false, 00:16:17.735 "write_zeroes": true, 00:16:17.735 "zcopy": true, 00:16:17.735 "get_zone_info": false, 00:16:17.735 "zone_management": false, 00:16:17.735 "zone_append": false, 00:16:17.735 "compare": false, 00:16:17.735 "compare_and_write": false, 00:16:17.735 "abort": true, 00:16:17.735 "seek_hole": false, 00:16:17.735 "seek_data": false, 00:16:17.735 "copy": true, 00:16:17.735 "nvme_iov_md": false 00:16:17.735 }, 00:16:17.735 "memory_domains": [ 00:16:17.735 { 00:16:17.735 "dma_device_id": "system", 00:16:17.735 "dma_device_type": 1 00:16:17.735 }, 00:16:17.735 { 00:16:17.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.735 "dma_device_type": 2 00:16:17.735 } 00:16:17.735 ], 00:16:17.735 "driver_specific": { 00:16:17.735 "passthru": { 00:16:17.735 "name": "pt2", 00:16:17.735 "base_bdev_name": "malloc2" 00:16:17.735 } 00:16:17.735 } 00:16:17.735 }' 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.735 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.994 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.994 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.994 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:16:17.995 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.995 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.995 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:17.995 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.254 "name": "pt3", 00:16:18.254 "aliases": [ 00:16:18.254 "00000000-0000-0000-0000-000000000003" 00:16:18.254 ], 00:16:18.254 "product_name": "passthru", 00:16:18.254 "block_size": 512, 00:16:18.254 "num_blocks": 65536, 00:16:18.254 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:18.254 "assigned_rate_limits": { 00:16:18.254 "rw_ios_per_sec": 0, 00:16:18.254 "rw_mbytes_per_sec": 0, 00:16:18.254 "r_mbytes_per_sec": 0, 00:16:18.254 "w_mbytes_per_sec": 0 00:16:18.254 }, 00:16:18.254 "claimed": true, 00:16:18.254 "claim_type": "exclusive_write", 00:16:18.254 "zoned": false, 00:16:18.254 "supported_io_types": { 00:16:18.254 "read": true, 00:16:18.254 "write": true, 00:16:18.254 "unmap": true, 00:16:18.254 "flush": true, 00:16:18.254 "reset": true, 00:16:18.254 "nvme_admin": false, 00:16:18.254 "nvme_io": false, 00:16:18.254 "nvme_io_md": false, 00:16:18.254 "write_zeroes": true, 00:16:18.254 "zcopy": true, 00:16:18.254 "get_zone_info": false, 00:16:18.254 "zone_management": false, 00:16:18.254 "zone_append": false, 00:16:18.254 "compare": false, 00:16:18.254 "compare_and_write": false, 00:16:18.254 "abort": true, 00:16:18.254 "seek_hole": false, 00:16:18.254 "seek_data": false, 00:16:18.254 "copy": true, 00:16:18.254 "nvme_iov_md": false 00:16:18.254 }, 00:16:18.254 "memory_domains": [ 00:16:18.254 { 00:16:18.254 "dma_device_id": "system", 00:16:18.254 
"dma_device_type": 1 00:16:18.254 }, 00:16:18.254 { 00:16:18.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.254 "dma_device_type": 2 00:16:18.254 } 00:16:18.254 ], 00:16:18.254 "driver_specific": { 00:16:18.254 "passthru": { 00:16:18.254 "name": "pt3", 00:16:18.254 "base_bdev_name": "malloc3" 00:16:18.254 } 00:16:18.254 } 00:16:18.254 }' 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.254 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.513 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.513 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.513 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.513 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.513 19:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.513 19:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:18.513 19:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:16:19.081 [2024-07-24 19:52:10.487196] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:19.081 19:52:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=15ee56e0-3b33-41d6-8948-dc776038ddf0 00:16:19.081 19:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 15ee56e0-3b33-41d6-8948-dc776038ddf0 ']' 00:16:19.081 19:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:19.339 [2024-07-24 19:52:10.743587] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:19.339 [2024-07-24 19:52:10.743606] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:19.339 [2024-07-24 19:52:10.743655] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:19.339 [2024-07-24 19:52:10.743706] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:19.339 [2024-07-24 19:52:10.743718] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2293d10 name raid_bdev1, state offline 00:16:19.339 19:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.339 19:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:16:19.598 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:16:19.598 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:16:19.598 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:16:19.598 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:19.857 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i 
in "${base_bdevs_pt[@]}" 00:16:19.857 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:20.115 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:16:20.116 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:20.375 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:20.375 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:20.634 19:52:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:20.892 [2024-07-24 19:52:12.255514] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:20.892 [2024-07-24 19:52:12.256855] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:20.892 [2024-07-24 19:52:12.256898] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:20.892 [2024-07-24 19:52:12.256943] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:20.892 [2024-07-24 19:52:12.256981] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:20.892 [2024-07-24 19:52:12.257005] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:20.892 [2024-07-24 19:52:12.257022] bdev_raid.c:2382:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:16:20.892 [2024-07-24 19:52:12.257032] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e2c50 name raid_bdev1, state configuring 00:16:20.892 request: 00:16:20.892 { 00:16:20.892 "name": "raid_bdev1", 00:16:20.892 "raid_level": "concat", 00:16:20.892 "base_bdevs": [ 00:16:20.892 "malloc1", 00:16:20.892 "malloc2", 00:16:20.892 "malloc3" 00:16:20.892 ], 00:16:20.892 "strip_size_kb": 64, 00:16:20.892 "superblock": false, 00:16:20.892 "method": "bdev_raid_create", 00:16:20.892 "req_id": 1 00:16:20.892 } 00:16:20.892 Got JSON-RPC error response 00:16:20.892 response: 00:16:20.892 { 00:16:20.892 "code": -17, 00:16:20.892 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:20.892 } 00:16:20.892 19:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:16:20.892 19:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:16:20.892 19:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:16:20.892 19:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:16:20.892 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.892 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:16:21.151 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:16:21.151 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:16:21.151 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:21.410 [2024-07-24 19:52:12.748887] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on malloc1 00:16:21.410 [2024-07-24 19:52:12.748927] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.410 [2024-07-24 19:52:12.748945] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2291460 00:16:21.410 [2024-07-24 19:52:12.748957] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.410 [2024-07-24 19:52:12.750544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.410 [2024-07-24 19:52:12.750572] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:21.410 [2024-07-24 19:52:12.750637] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:21.410 [2024-07-24 19:52:12.750664] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:21.410 pt1 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.410 19:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:21.670 19:52:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.670 "name": "raid_bdev1", 00:16:21.670 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:21.670 "strip_size_kb": 64, 00:16:21.670 "state": "configuring", 00:16:21.670 "raid_level": "concat", 00:16:21.670 "superblock": true, 00:16:21.670 "num_base_bdevs": 3, 00:16:21.670 "num_base_bdevs_discovered": 1, 00:16:21.670 "num_base_bdevs_operational": 3, 00:16:21.670 "base_bdevs_list": [ 00:16:21.670 { 00:16:21.670 "name": "pt1", 00:16:21.670 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:21.670 "is_configured": true, 00:16:21.670 "data_offset": 2048, 00:16:21.670 "data_size": 63488 00:16:21.670 }, 00:16:21.670 { 00:16:21.670 "name": null, 00:16:21.670 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:21.670 "is_configured": false, 00:16:21.670 "data_offset": 2048, 00:16:21.670 "data_size": 63488 00:16:21.670 }, 00:16:21.670 { 00:16:21.670 "name": null, 00:16:21.670 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:21.670 "is_configured": false, 00:16:21.670 "data_offset": 2048, 00:16:21.670 "data_size": 63488 00:16:21.670 } 00:16:21.670 ] 00:16:21.670 }' 00:16:21.670 19:52:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.670 19:52:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.239 19:52:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:16:22.239 19:52:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:22.239 [2024-07-24 19:52:13.827762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:22.239 [2024-07-24 19:52:13.827807] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.240 [2024-07-24 19:52:13.827827] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e3260 00:16:22.240 [2024-07-24 19:52:13.827839] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.240 [2024-07-24 19:52:13.828176] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.240 [2024-07-24 19:52:13.828194] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:22.240 [2024-07-24 19:52:13.828253] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:22.240 [2024-07-24 19:52:13.828272] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:22.498 pt2 00:16:22.498 19:52:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:22.498 [2024-07-24 19:52:14.076441] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.757 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:23.016 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.016 "name": "raid_bdev1", 00:16:23.016 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:23.016 "strip_size_kb": 64, 00:16:23.016 "state": "configuring", 00:16:23.016 "raid_level": "concat", 00:16:23.016 "superblock": true, 00:16:23.016 "num_base_bdevs": 3, 00:16:23.016 "num_base_bdevs_discovered": 1, 00:16:23.016 "num_base_bdevs_operational": 3, 00:16:23.016 "base_bdevs_list": [ 00:16:23.016 { 00:16:23.016 "name": "pt1", 00:16:23.016 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:23.016 "is_configured": true, 00:16:23.016 "data_offset": 2048, 00:16:23.016 "data_size": 63488 00:16:23.016 }, 00:16:23.016 { 00:16:23.016 "name": null, 00:16:23.016 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:23.016 "is_configured": false, 00:16:23.016 "data_offset": 2048, 00:16:23.016 "data_size": 63488 00:16:23.016 }, 00:16:23.016 { 00:16:23.016 "name": null, 00:16:23.016 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:23.016 "is_configured": false, 00:16:23.016 "data_offset": 2048, 00:16:23.016 "data_size": 63488 00:16:23.016 } 00:16:23.016 ] 00:16:23.016 }' 
00:16:23.016 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.016 19:52:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.584 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:16:23.584 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:16:23.584 19:52:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:23.584 [2024-07-24 19:52:15.167314] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:23.584 [2024-07-24 19:52:15.167359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:23.584 [2024-07-24 19:52:15.167378] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x229cd20 00:16:23.584 [2024-07-24 19:52:15.167398] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:23.584 [2024-07-24 19:52:15.167731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:23.584 [2024-07-24 19:52:15.167748] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:23.584 [2024-07-24 19:52:15.167813] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:23.584 [2024-07-24 19:52:15.167833] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:23.584 pt2 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:23.843 [2024-07-24 19:52:15.351804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:23.843 [2024-07-24 19:52:15.351832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:23.843 [2024-07-24 19:52:15.351847] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20eb7c0 00:16:23.843 [2024-07-24 19:52:15.351858] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:23.843 [2024-07-24 19:52:15.352129] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:23.843 [2024-07-24 19:52:15.352146] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:23.843 [2024-07-24 19:52:15.352192] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:23.843 [2024-07-24 19:52:15.352208] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:23.843 [2024-07-24 19:52:15.352305] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20e23c0 00:16:23.843 [2024-07-24 19:52:15.352315] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:23.843 [2024-07-24 19:52:15.352485] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e6550 00:16:23.843 [2024-07-24 19:52:15.352619] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20e23c0 00:16:23.843 [2024-07-24 19:52:15.352629] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20e23c0 00:16:23.843 [2024-07-24 19:52:15.352722] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:23.843 pt3 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.843 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:24.102 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.102 "name": "raid_bdev1", 00:16:24.102 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:24.102 "strip_size_kb": 64, 00:16:24.102 "state": "online", 00:16:24.102 "raid_level": "concat", 00:16:24.102 "superblock": true, 00:16:24.102 "num_base_bdevs": 3, 00:16:24.102 "num_base_bdevs_discovered": 3, 00:16:24.102 "num_base_bdevs_operational": 3, 00:16:24.102 "base_bdevs_list": [ 00:16:24.102 { 
00:16:24.102 "name": "pt1", 00:16:24.102 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:24.102 "is_configured": true, 00:16:24.102 "data_offset": 2048, 00:16:24.102 "data_size": 63488 00:16:24.102 }, 00:16:24.102 { 00:16:24.102 "name": "pt2", 00:16:24.102 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:24.102 "is_configured": true, 00:16:24.102 "data_offset": 2048, 00:16:24.102 "data_size": 63488 00:16:24.102 }, 00:16:24.102 { 00:16:24.102 "name": "pt3", 00:16:24.102 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:24.102 "is_configured": true, 00:16:24.102 "data_offset": 2048, 00:16:24.102 "data_size": 63488 00:16:24.102 } 00:16:24.102 ] 00:16:24.102 }' 00:16:24.102 19:52:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.102 19:52:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:24.669 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:24.927 [2024-07-24 19:52:16.374777] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:24.927 19:52:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:24.927 "name": "raid_bdev1", 00:16:24.927 "aliases": [ 00:16:24.927 "15ee56e0-3b33-41d6-8948-dc776038ddf0" 00:16:24.927 ], 00:16:24.927 "product_name": "Raid Volume", 00:16:24.927 "block_size": 512, 00:16:24.927 "num_blocks": 190464, 00:16:24.927 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:24.927 "assigned_rate_limits": { 00:16:24.927 "rw_ios_per_sec": 0, 00:16:24.927 "rw_mbytes_per_sec": 0, 00:16:24.927 "r_mbytes_per_sec": 0, 00:16:24.927 "w_mbytes_per_sec": 0 00:16:24.927 }, 00:16:24.927 "claimed": false, 00:16:24.927 "zoned": false, 00:16:24.927 "supported_io_types": { 00:16:24.927 "read": true, 00:16:24.927 "write": true, 00:16:24.927 "unmap": true, 00:16:24.927 "flush": true, 00:16:24.927 "reset": true, 00:16:24.927 "nvme_admin": false, 00:16:24.927 "nvme_io": false, 00:16:24.927 "nvme_io_md": false, 00:16:24.927 "write_zeroes": true, 00:16:24.927 "zcopy": false, 00:16:24.927 "get_zone_info": false, 00:16:24.927 "zone_management": false, 00:16:24.927 "zone_append": false, 00:16:24.927 "compare": false, 00:16:24.927 "compare_and_write": false, 00:16:24.927 "abort": false, 00:16:24.927 "seek_hole": false, 00:16:24.927 "seek_data": false, 00:16:24.927 "copy": false, 00:16:24.927 "nvme_iov_md": false 00:16:24.927 }, 00:16:24.927 "memory_domains": [ 00:16:24.927 { 00:16:24.927 "dma_device_id": "system", 00:16:24.927 "dma_device_type": 1 00:16:24.927 }, 00:16:24.927 { 00:16:24.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.927 "dma_device_type": 2 00:16:24.927 }, 00:16:24.927 { 00:16:24.927 "dma_device_id": "system", 00:16:24.927 "dma_device_type": 1 00:16:24.927 }, 00:16:24.927 { 00:16:24.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.927 "dma_device_type": 2 00:16:24.927 }, 00:16:24.927 { 00:16:24.927 "dma_device_id": "system", 00:16:24.927 "dma_device_type": 1 00:16:24.927 }, 00:16:24.927 { 00:16:24.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.927 "dma_device_type": 2 
00:16:24.927 } 00:16:24.927 ], 00:16:24.927 "driver_specific": { 00:16:24.927 "raid": { 00:16:24.927 "uuid": "15ee56e0-3b33-41d6-8948-dc776038ddf0", 00:16:24.927 "strip_size_kb": 64, 00:16:24.927 "state": "online", 00:16:24.927 "raid_level": "concat", 00:16:24.927 "superblock": true, 00:16:24.927 "num_base_bdevs": 3, 00:16:24.927 "num_base_bdevs_discovered": 3, 00:16:24.927 "num_base_bdevs_operational": 3, 00:16:24.927 "base_bdevs_list": [ 00:16:24.927 { 00:16:24.927 "name": "pt1", 00:16:24.927 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:24.927 "is_configured": true, 00:16:24.927 "data_offset": 2048, 00:16:24.927 "data_size": 63488 00:16:24.927 }, 00:16:24.927 { 00:16:24.927 "name": "pt2", 00:16:24.927 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:24.927 "is_configured": true, 00:16:24.927 "data_offset": 2048, 00:16:24.927 "data_size": 63488 00:16:24.927 }, 00:16:24.927 { 00:16:24.927 "name": "pt3", 00:16:24.927 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:24.927 "is_configured": true, 00:16:24.927 "data_offset": 2048, 00:16:24.927 "data_size": 63488 00:16:24.927 } 00:16:24.927 ] 00:16:24.927 } 00:16:24.927 } 00:16:24.927 }' 00:16:24.927 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:24.927 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:24.927 pt2 00:16:24.927 pt3' 00:16:24.927 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.927 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:24.927 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.229 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.229 "name": 
"pt1", 00:16:25.229 "aliases": [ 00:16:25.229 "00000000-0000-0000-0000-000000000001" 00:16:25.229 ], 00:16:25.229 "product_name": "passthru", 00:16:25.229 "block_size": 512, 00:16:25.229 "num_blocks": 65536, 00:16:25.229 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:25.229 "assigned_rate_limits": { 00:16:25.229 "rw_ios_per_sec": 0, 00:16:25.229 "rw_mbytes_per_sec": 0, 00:16:25.229 "r_mbytes_per_sec": 0, 00:16:25.229 "w_mbytes_per_sec": 0 00:16:25.229 }, 00:16:25.229 "claimed": true, 00:16:25.229 "claim_type": "exclusive_write", 00:16:25.229 "zoned": false, 00:16:25.229 "supported_io_types": { 00:16:25.229 "read": true, 00:16:25.229 "write": true, 00:16:25.229 "unmap": true, 00:16:25.229 "flush": true, 00:16:25.229 "reset": true, 00:16:25.229 "nvme_admin": false, 00:16:25.229 "nvme_io": false, 00:16:25.229 "nvme_io_md": false, 00:16:25.229 "write_zeroes": true, 00:16:25.229 "zcopy": true, 00:16:25.229 "get_zone_info": false, 00:16:25.229 "zone_management": false, 00:16:25.229 "zone_append": false, 00:16:25.229 "compare": false, 00:16:25.229 "compare_and_write": false, 00:16:25.229 "abort": true, 00:16:25.229 "seek_hole": false, 00:16:25.229 "seek_data": false, 00:16:25.229 "copy": true, 00:16:25.229 "nvme_iov_md": false 00:16:25.229 }, 00:16:25.229 "memory_domains": [ 00:16:25.229 { 00:16:25.229 "dma_device_id": "system", 00:16:25.229 "dma_device_type": 1 00:16:25.229 }, 00:16:25.229 { 00:16:25.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.229 "dma_device_type": 2 00:16:25.229 } 00:16:25.229 ], 00:16:25.229 "driver_specific": { 00:16:25.229 "passthru": { 00:16:25.229 "name": "pt1", 00:16:25.229 "base_bdev_name": "malloc1" 00:16:25.229 } 00:16:25.229 } 00:16:25.229 }' 00:16:25.229 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.229 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.520 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:16:25.520 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.520 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.520 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.520 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.520 19:52:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.520 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.520 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.520 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.520 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.520 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.520 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:25.520 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.778 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.778 "name": "pt2", 00:16:25.778 "aliases": [ 00:16:25.778 "00000000-0000-0000-0000-000000000002" 00:16:25.778 ], 00:16:25.778 "product_name": "passthru", 00:16:25.778 "block_size": 512, 00:16:25.778 "num_blocks": 65536, 00:16:25.778 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:25.778 "assigned_rate_limits": { 00:16:25.778 "rw_ios_per_sec": 0, 00:16:25.778 "rw_mbytes_per_sec": 0, 00:16:25.778 "r_mbytes_per_sec": 0, 00:16:25.778 "w_mbytes_per_sec": 0 00:16:25.778 }, 00:16:25.778 "claimed": true, 00:16:25.778 "claim_type": "exclusive_write", 00:16:25.778 "zoned": false, 
00:16:25.778 "supported_io_types": { 00:16:25.778 "read": true, 00:16:25.778 "write": true, 00:16:25.778 "unmap": true, 00:16:25.778 "flush": true, 00:16:25.778 "reset": true, 00:16:25.778 "nvme_admin": false, 00:16:25.778 "nvme_io": false, 00:16:25.778 "nvme_io_md": false, 00:16:25.778 "write_zeroes": true, 00:16:25.778 "zcopy": true, 00:16:25.778 "get_zone_info": false, 00:16:25.778 "zone_management": false, 00:16:25.778 "zone_append": false, 00:16:25.778 "compare": false, 00:16:25.778 "compare_and_write": false, 00:16:25.778 "abort": true, 00:16:25.778 "seek_hole": false, 00:16:25.778 "seek_data": false, 00:16:25.778 "copy": true, 00:16:25.778 "nvme_iov_md": false 00:16:25.778 }, 00:16:25.778 "memory_domains": [ 00:16:25.778 { 00:16:25.778 "dma_device_id": "system", 00:16:25.778 "dma_device_type": 1 00:16:25.778 }, 00:16:25.778 { 00:16:25.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.778 "dma_device_type": 2 00:16:25.778 } 00:16:25.778 ], 00:16:25.778 "driver_specific": { 00:16:25.778 "passthru": { 00:16:25.778 "name": "pt2", 00:16:25.778 "base_bdev_name": "malloc2" 00:16:25.778 } 00:16:25.778 } 00:16:25.778 }' 00:16:25.778 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.037 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.037 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.037 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.037 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.037 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.037 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.037 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.295 19:52:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.295 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.295 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.295 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.295 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.295 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:26.295 19:52:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:26.862 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:26.862 "name": "pt3", 00:16:26.862 "aliases": [ 00:16:26.862 "00000000-0000-0000-0000-000000000003" 00:16:26.862 ], 00:16:26.862 "product_name": "passthru", 00:16:26.862 "block_size": 512, 00:16:26.862 "num_blocks": 65536, 00:16:26.862 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:26.862 "assigned_rate_limits": { 00:16:26.862 "rw_ios_per_sec": 0, 00:16:26.862 "rw_mbytes_per_sec": 0, 00:16:26.862 "r_mbytes_per_sec": 0, 00:16:26.862 "w_mbytes_per_sec": 0 00:16:26.862 }, 00:16:26.862 "claimed": true, 00:16:26.862 "claim_type": "exclusive_write", 00:16:26.862 "zoned": false, 00:16:26.862 "supported_io_types": { 00:16:26.862 "read": true, 00:16:26.862 "write": true, 00:16:26.862 "unmap": true, 00:16:26.862 "flush": true, 00:16:26.862 "reset": true, 00:16:26.862 "nvme_admin": false, 00:16:26.862 "nvme_io": false, 00:16:26.862 "nvme_io_md": false, 00:16:26.862 "write_zeroes": true, 00:16:26.862 "zcopy": true, 00:16:26.862 "get_zone_info": false, 00:16:26.862 "zone_management": false, 00:16:26.862 "zone_append": false, 00:16:26.862 "compare": false, 00:16:26.862 "compare_and_write": false, 00:16:26.862 "abort": true, 00:16:26.862 
"seek_hole": false, 00:16:26.862 "seek_data": false, 00:16:26.862 "copy": true, 00:16:26.862 "nvme_iov_md": false 00:16:26.862 }, 00:16:26.862 "memory_domains": [ 00:16:26.862 { 00:16:26.862 "dma_device_id": "system", 00:16:26.862 "dma_device_type": 1 00:16:26.862 }, 00:16:26.862 { 00:16:26.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.862 "dma_device_type": 2 00:16:26.862 } 00:16:26.862 ], 00:16:26.862 "driver_specific": { 00:16:26.862 "passthru": { 00:16:26.862 "name": "pt3", 00:16:26.862 "base_bdev_name": "malloc3" 00:16:26.862 } 00:16:26.862 } 00:16:26.862 }' 00:16:26.862 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.862 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.862 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.862 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.121 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.121 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:27.121 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.121 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.121 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.121 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.121 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.380 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.380 19:52:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:27.380 19:52:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:16:27.638 [2024-07-24 19:52:19.218340] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 15ee56e0-3b33-41d6-8948-dc776038ddf0 '!=' 15ee56e0-3b33-41d6-8948-dc776038ddf0 ']' 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1416940 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1416940 ']' 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1416940 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1416940 00:16:27.896 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:27.897 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:27.897 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1416940' 00:16:27.897 killing process with pid 1416940 00:16:27.897 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1416940 00:16:27.897 [2024-07-24 19:52:19.303472] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:27.897 [2024-07-24 19:52:19.303525] bdev_raid.c: 
486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:27.897 [2024-07-24 19:52:19.303578] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:27.897 [2024-07-24 19:52:19.303589] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e23c0 name raid_bdev1, state offline 00:16:27.897 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1416940 00:16:27.897 [2024-07-24 19:52:19.330307] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:28.156 19:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:16:28.156 00:16:28.156 real 0m14.688s 00:16:28.156 user 0m27.016s 00:16:28.156 sys 0m2.662s 00:16:28.156 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:28.156 19:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.156 ************************************ 00:16:28.156 END TEST raid_superblock_test 00:16:28.156 ************************************ 00:16:28.156 19:52:19 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:16:28.156 19:52:19 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:28.156 19:52:19 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:28.156 19:52:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:28.156 ************************************ 00:16:28.156 START TEST raid_read_error_test 00:16:28.156 ************************************ 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 read 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:16:28.156 19:52:19 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.DlrW8gm0Qw 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1419159 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1419159 /var/tmp/spdk-raid.sock 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1419159 ']' 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:28.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:28.156 19:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.156 [2024-07-24 19:52:19.737527] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:16:28.156 [2024-07-24 19:52:19.737664] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1419159 ] 00:16:28.415 [2024-07-24 19:52:19.932999] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:28.674 [2024-07-24 19:52:20.041388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:28.674 [2024-07-24 19:52:20.108064] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.674 [2024-07-24 19:52:20.108094] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:29.241 19:52:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:29.241 19:52:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:16:29.241 19:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:16:29.241 19:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:29.498 BaseBdev1_malloc 00:16:29.498 19:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:29.756 true 00:16:29.756 19:52:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:29.756 [2024-07-24 19:52:21.337193] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:29.756 [2024-07-24 19:52:21.337236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:16:29.756 [2024-07-24 19:52:21.337257] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dba3a0 00:16:29.756 [2024-07-24 19:52:21.337270] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:29.756 [2024-07-24 19:52:21.338987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:29.756 [2024-07-24 19:52:21.339016] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:29.756 BaseBdev1 00:16:30.015 19:52:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:16:30.015 19:52:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:30.015 BaseBdev2_malloc 00:16:30.273 19:52:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:30.273 true 00:16:30.273 19:52:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:30.532 [2024-07-24 19:52:22.064913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:30.532 [2024-07-24 19:52:22.064958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:30.532 [2024-07-24 19:52:22.064981] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e79370 00:16:30.532 [2024-07-24 19:52:22.064994] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:30.532 [2024-07-24 19:52:22.066571] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:30.532 [2024-07-24 19:52:22.066599] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:30.532 BaseBdev2 00:16:30.532 19:52:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:16:30.532 19:52:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:30.791 BaseBdev3_malloc 00:16:30.791 19:52:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:31.050 true 00:16:31.050 19:52:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:31.309 [2024-07-24 19:52:22.800663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:31.309 [2024-07-24 19:52:22.800709] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:31.309 [2024-07-24 19:52:22.800732] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1daf2d0 00:16:31.309 [2024-07-24 19:52:22.800745] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:31.309 [2024-07-24 19:52:22.802330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:31.309 [2024-07-24 19:52:22.802359] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:31.309 BaseBdev3 00:16:31.309 19:52:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:31.568 [2024-07-24 19:52:23.041324] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:31.568 [2024-07-24 19:52:23.042698] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:31.568 [2024-07-24 19:52:23.042768] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:31.568 [2024-07-24 19:52:23.042981] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1db0860 00:16:31.568 [2024-07-24 19:52:23.042993] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:31.568 [2024-07-24 19:52:23.043189] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1db96c0 00:16:31.568 [2024-07-24 19:52:23.043335] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1db0860 00:16:31.568 [2024-07-24 19:52:23.043345] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1db0860 00:16:31.568 [2024-07-24 19:52:23.043457] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.568 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:31.827 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.827 "name": "raid_bdev1", 00:16:31.827 "uuid": "bb9176e3-472b-4058-b19a-1dd8e1c2bf0f", 00:16:31.827 "strip_size_kb": 64, 00:16:31.827 "state": "online", 00:16:31.827 "raid_level": "concat", 00:16:31.827 "superblock": true, 00:16:31.827 "num_base_bdevs": 3, 00:16:31.827 "num_base_bdevs_discovered": 3, 00:16:31.827 "num_base_bdevs_operational": 3, 00:16:31.827 "base_bdevs_list": [ 00:16:31.827 { 00:16:31.827 "name": "BaseBdev1", 00:16:31.827 "uuid": "0405bd5c-3f2c-5374-a92c-36da81f6a977", 00:16:31.827 "is_configured": true, 00:16:31.827 "data_offset": 2048, 00:16:31.827 "data_size": 63488 00:16:31.827 }, 00:16:31.827 { 00:16:31.827 "name": "BaseBdev2", 00:16:31.827 "uuid": "f8e8fb94-8938-53e4-98c8-37fde15bdef2", 00:16:31.827 "is_configured": true, 00:16:31.827 "data_offset": 2048, 00:16:31.827 "data_size": 63488 00:16:31.827 }, 00:16:31.827 { 00:16:31.827 "name": "BaseBdev3", 00:16:31.827 "uuid": "87180a72-409f-560e-a939-4765eae2de8a", 00:16:31.827 "is_configured": true, 00:16:31.827 "data_offset": 2048, 00:16:31.827 "data_size": 63488 00:16:31.827 } 00:16:31.827 ] 00:16:31.827 }' 00:16:31.827 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.827 19:52:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.395 19:52:23 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@840 -- # sleep 1 00:16:32.395 19:52:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:32.653 [2024-07-24 19:52:24.072373] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1db8b60 00:16:33.589 19:52:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.848 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:34.106 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.106 "name": "raid_bdev1", 00:16:34.106 "uuid": "bb9176e3-472b-4058-b19a-1dd8e1c2bf0f", 00:16:34.106 "strip_size_kb": 64, 00:16:34.106 "state": "online", 00:16:34.106 "raid_level": "concat", 00:16:34.106 "superblock": true, 00:16:34.106 "num_base_bdevs": 3, 00:16:34.106 "num_base_bdevs_discovered": 3, 00:16:34.106 "num_base_bdevs_operational": 3, 00:16:34.106 "base_bdevs_list": [ 00:16:34.106 { 00:16:34.106 "name": "BaseBdev1", 00:16:34.106 "uuid": "0405bd5c-3f2c-5374-a92c-36da81f6a977", 00:16:34.106 "is_configured": true, 00:16:34.106 "data_offset": 2048, 00:16:34.106 "data_size": 63488 00:16:34.106 }, 00:16:34.106 { 00:16:34.106 "name": "BaseBdev2", 00:16:34.106 "uuid": "f8e8fb94-8938-53e4-98c8-37fde15bdef2", 00:16:34.106 "is_configured": true, 00:16:34.106 "data_offset": 2048, 00:16:34.106 "data_size": 63488 00:16:34.106 }, 00:16:34.106 { 00:16:34.106 "name": "BaseBdev3", 00:16:34.106 "uuid": "87180a72-409f-560e-a939-4765eae2de8a", 00:16:34.106 "is_configured": true, 00:16:34.106 "data_offset": 2048, 00:16:34.106 "data_size": 63488 00:16:34.106 } 00:16:34.106 ] 00:16:34.106 }' 00:16:34.106 19:52:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.106 19:52:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.673 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:34.932 [2024-07-24 
19:52:26.309647] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:34.932 [2024-07-24 19:52:26.309677] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:34.932 [2024-07-24 19:52:26.312838] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:34.932 [2024-07-24 19:52:26.312872] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:34.932 [2024-07-24 19:52:26.312907] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:34.932 [2024-07-24 19:52:26.312918] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1db0860 name raid_bdev1, state offline 00:16:34.932 0 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1419159 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1419159 ']' 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1419159 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1419159 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1419159' 00:16:34.932 killing process with pid 1419159 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1419159 00:16:34.932 [2024-07-24 19:52:26.397929] 
bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:34.932 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1419159 00:16:34.932 [2024-07-24 19:52:26.418636] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.DlrW8gm0Qw 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.45 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.45 != \0\.\0\0 ]] 00:16:35.191 00:16:35.191 real 0m7.028s 00:16:35.191 user 0m11.127s 00:16:35.191 sys 0m1.295s 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:35.191 19:52:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.191 ************************************ 00:16:35.191 END TEST raid_read_error_test 00:16:35.191 ************************************ 00:16:35.191 19:52:26 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:16:35.191 19:52:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:35.191 19:52:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:35.191 19:52:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:35.191 ************************************ 00:16:35.191 START TEST raid_write_error_test 
00:16:35.191 ************************************ 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 write 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:16:35.191 19:52:26 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:16:35.191 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.xK8hxu6STl 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1420140 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1420140 /var/tmp/spdk-raid.sock 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1420140 ']' 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:16:35.192 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:35.192 19:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.451 [2024-07-24 19:52:26.809581] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:16:35.451 [2024-07-24 19:52:26.809654] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1420140 ] 00:16:35.451 [2024-07-24 19:52:26.941591] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:35.451 [2024-07-24 19:52:27.042707] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:35.710 [2024-07-24 19:52:27.106514] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:35.710 [2024-07-24 19:52:27.106554] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:36.276 19:52:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:36.276 19:52:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:16:36.276 19:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:16:36.276 19:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:36.534 BaseBdev1_malloc 00:16:36.534 19:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:36.793 true 00:16:36.793 19:52:28 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:37.051 [2024-07-24 19:52:28.472894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:37.051 [2024-07-24 19:52:28.472936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:37.051 [2024-07-24 19:52:28.472957] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc173a0 00:16:37.051 [2024-07-24 19:52:28.472976] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:37.051 [2024-07-24 19:52:28.474605] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:37.051 [2024-07-24 19:52:28.474633] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:37.051 BaseBdev1 00:16:37.051 19:52:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:16:37.051 19:52:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:37.309 BaseBdev2_malloc 00:16:37.309 19:52:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:37.567 true 00:16:37.567 19:52:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:37.824 [2024-07-24 19:52:29.227504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:37.824 [2024-07-24 19:52:29.227546] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:37.824 [2024-07-24 19:52:29.227568] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd6370 00:16:37.824 [2024-07-24 19:52:29.227580] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:37.824 [2024-07-24 19:52:29.228953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:37.824 [2024-07-24 19:52:29.228979] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:37.824 BaseBdev2 00:16:37.824 19:52:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:16:37.824 19:52:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:38.082 BaseBdev3_malloc 00:16:38.082 19:52:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:38.341 true 00:16:38.341 19:52:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:38.600 [2024-07-24 19:52:29.965983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:38.600 [2024-07-24 19:52:29.966023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:38.600 [2024-07-24 19:52:29.966044] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0c2d0 00:16:38.600 [2024-07-24 19:52:29.966057] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:38.600 [2024-07-24 19:52:29.967456] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:16:38.600 [2024-07-24 19:52:29.967484] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:38.600 BaseBdev3 00:16:38.600 19:52:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:38.859 [2024-07-24 19:52:30.214686] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:38.859 [2024-07-24 19:52:30.216006] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:38.859 [2024-07-24 19:52:30.216073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:38.859 [2024-07-24 19:52:30.216282] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc0d860 00:16:38.859 [2024-07-24 19:52:30.216294] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:38.859 [2024-07-24 19:52:30.216503] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc166c0 00:16:38.859 [2024-07-24 19:52:30.216655] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc0d860 00:16:38.859 [2024-07-24 19:52:30.216666] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc0d860 00:16:38.859 [2024-07-24 19:52:30.216767] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.859 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:39.119 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.119 "name": "raid_bdev1", 00:16:39.119 "uuid": "837db719-3cc8-498d-8732-b2c290d5043d", 00:16:39.119 "strip_size_kb": 64, 00:16:39.119 "state": "online", 00:16:39.119 "raid_level": "concat", 00:16:39.119 "superblock": true, 00:16:39.119 "num_base_bdevs": 3, 00:16:39.119 "num_base_bdevs_discovered": 3, 00:16:39.119 "num_base_bdevs_operational": 3, 00:16:39.119 "base_bdevs_list": [ 00:16:39.119 { 00:16:39.119 "name": "BaseBdev1", 00:16:39.119 "uuid": "d9d75ce2-03a9-58b0-9dbb-e12bbb73e88f", 00:16:39.119 "is_configured": true, 00:16:39.119 "data_offset": 2048, 00:16:39.119 "data_size": 63488 00:16:39.119 }, 00:16:39.119 { 00:16:39.119 "name": "BaseBdev2", 00:16:39.119 "uuid": "2c82b313-e2bd-506a-9bb7-5ca4fa9aaafe", 00:16:39.119 "is_configured": true, 00:16:39.119 "data_offset": 2048, 00:16:39.119 "data_size": 63488 00:16:39.119 }, 00:16:39.119 { 00:16:39.119 "name": "BaseBdev3", 00:16:39.119 
"uuid": "59367901-e895-5ee4-95da-4902424968a2", 00:16:39.119 "is_configured": true, 00:16:39.119 "data_offset": 2048, 00:16:39.119 "data_size": 63488 00:16:39.119 } 00:16:39.119 ] 00:16:39.119 }' 00:16:39.119 19:52:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.119 19:52:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.687 19:52:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:16:39.687 19:52:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:39.687 [2024-07-24 19:52:31.169546] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc15b60 00:16:40.624 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.883 19:52:32 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.883 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:41.142 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.142 "name": "raid_bdev1", 00:16:41.142 "uuid": "837db719-3cc8-498d-8732-b2c290d5043d", 00:16:41.142 "strip_size_kb": 64, 00:16:41.142 "state": "online", 00:16:41.142 "raid_level": "concat", 00:16:41.142 "superblock": true, 00:16:41.142 "num_base_bdevs": 3, 00:16:41.142 "num_base_bdevs_discovered": 3, 00:16:41.142 "num_base_bdevs_operational": 3, 00:16:41.142 "base_bdevs_list": [ 00:16:41.142 { 00:16:41.142 "name": "BaseBdev1", 00:16:41.142 "uuid": "d9d75ce2-03a9-58b0-9dbb-e12bbb73e88f", 00:16:41.142 "is_configured": true, 00:16:41.142 "data_offset": 2048, 00:16:41.142 "data_size": 63488 00:16:41.142 }, 00:16:41.142 { 00:16:41.142 "name": "BaseBdev2", 00:16:41.142 "uuid": "2c82b313-e2bd-506a-9bb7-5ca4fa9aaafe", 00:16:41.142 "is_configured": true, 00:16:41.142 "data_offset": 2048, 00:16:41.142 "data_size": 63488 00:16:41.142 }, 00:16:41.142 { 00:16:41.142 "name": "BaseBdev3", 00:16:41.142 "uuid": "59367901-e895-5ee4-95da-4902424968a2", 00:16:41.142 "is_configured": true, 00:16:41.142 "data_offset": 2048, 00:16:41.142 "data_size": 
63488 00:16:41.142 } 00:16:41.142 ] 00:16:41.142 }' 00:16:41.142 19:52:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.142 19:52:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.708 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:41.968 [2024-07-24 19:52:33.342020] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:41.968 [2024-07-24 19:52:33.342061] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:41.968 [2024-07-24 19:52:33.345237] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:41.968 [2024-07-24 19:52:33.345273] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:41.968 [2024-07-24 19:52:33.345308] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:41.968 [2024-07-24 19:52:33.345320] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc0d860 name raid_bdev1, state offline 00:16:41.968 0 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1420140 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1420140 ']' 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1420140 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1420140 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1420140' 00:16:41.968 killing process with pid 1420140 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1420140 00:16:41.968 [2024-07-24 19:52:33.423218] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:41.968 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1420140 00:16:41.968 [2024-07-24 19:52:33.443104] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.xK8hxu6STl 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:16:42.265 00:16:42.265 real 0m6.938s 00:16:42.265 user 0m11.019s 00:16:42.265 sys 0m1.206s 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:42.265 19:52:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.265 ************************************ 00:16:42.265 END TEST raid_write_error_test 00:16:42.265 
************************************ 00:16:42.265 19:52:33 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:16:42.265 19:52:33 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:42.265 19:52:33 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:42.265 19:52:33 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:42.265 19:52:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:42.265 ************************************ 00:16:42.265 START TEST raid_state_function_test 00:16:42.265 ************************************ 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 false 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:42.265 19:52:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1421119 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1421119' 00:16:42.265 Process raid pid: 1421119 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 
00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1421119 /var/tmp/spdk-raid.sock 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1421119 ']' 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:42.265 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:42.265 19:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.265 [2024-07-24 19:52:33.820534] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:16:42.265 [2024-07-24 19:52:33.820603] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:42.532 [2024-07-24 19:52:33.950893] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:42.532 [2024-07-24 19:52:34.057209] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:42.532 [2024-07-24 19:52:34.124967] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:42.532 [2024-07-24 19:52:34.124997] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:43.469 [2024-07-24 19:52:34.980079] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:43.469 [2024-07-24 19:52:34.980116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:43.469 [2024-07-24 19:52:34.980126] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:43.469 [2024-07-24 19:52:34.980138] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:43.469 [2024-07-24 19:52:34.980147] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:43.469 [2024-07-24 19:52:34.980158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:43.469 19:52:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.469 19:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.469 19:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.469 19:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.728 19:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.728 "name": "Existed_Raid", 00:16:43.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.728 "strip_size_kb": 0, 00:16:43.728 "state": "configuring", 00:16:43.728 "raid_level": "raid1", 00:16:43.728 "superblock": false, 00:16:43.728 "num_base_bdevs": 3, 00:16:43.728 "num_base_bdevs_discovered": 0, 00:16:43.728 "num_base_bdevs_operational": 3, 00:16:43.728 "base_bdevs_list": [ 00:16:43.728 { 00:16:43.728 
"name": "BaseBdev1", 00:16:43.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.728 "is_configured": false, 00:16:43.728 "data_offset": 0, 00:16:43.728 "data_size": 0 00:16:43.728 }, 00:16:43.728 { 00:16:43.728 "name": "BaseBdev2", 00:16:43.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.728 "is_configured": false, 00:16:43.728 "data_offset": 0, 00:16:43.728 "data_size": 0 00:16:43.728 }, 00:16:43.728 { 00:16:43.728 "name": "BaseBdev3", 00:16:43.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.728 "is_configured": false, 00:16:43.728 "data_offset": 0, 00:16:43.728 "data_size": 0 00:16:43.728 } 00:16:43.728 ] 00:16:43.728 }' 00:16:43.728 19:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.728 19:52:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.294 19:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:44.553 [2024-07-24 19:52:36.066823] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:44.553 [2024-07-24 19:52:36.066850] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b5a10 name Existed_Raid, state configuring 00:16:44.553 19:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:44.811 [2024-07-24 19:52:36.315498] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:44.811 [2024-07-24 19:52:36.315523] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:44.811 [2024-07-24 19:52:36.315533] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:16:44.811 [2024-07-24 19:52:36.315544] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:44.811 [2024-07-24 19:52:36.315553] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:44.811 [2024-07-24 19:52:36.315564] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:44.811 19:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:45.070 [2024-07-24 19:52:36.574061] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:45.070 BaseBdev1 00:16:45.070 19:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:45.070 19:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:45.070 19:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:45.070 19:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:45.070 19:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:45.070 19:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:45.070 19:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.329 19:52:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:45.589 [ 00:16:45.589 { 00:16:45.589 "name": "BaseBdev1", 00:16:45.589 "aliases": [ 00:16:45.589 "85873d97-62c7-4059-bd03-f3d55d1c2cfa" 
00:16:45.589 ], 00:16:45.589 "product_name": "Malloc disk", 00:16:45.589 "block_size": 512, 00:16:45.589 "num_blocks": 65536, 00:16:45.589 "uuid": "85873d97-62c7-4059-bd03-f3d55d1c2cfa", 00:16:45.589 "assigned_rate_limits": { 00:16:45.589 "rw_ios_per_sec": 0, 00:16:45.589 "rw_mbytes_per_sec": 0, 00:16:45.589 "r_mbytes_per_sec": 0, 00:16:45.589 "w_mbytes_per_sec": 0 00:16:45.589 }, 00:16:45.589 "claimed": true, 00:16:45.589 "claim_type": "exclusive_write", 00:16:45.589 "zoned": false, 00:16:45.589 "supported_io_types": { 00:16:45.589 "read": true, 00:16:45.589 "write": true, 00:16:45.589 "unmap": true, 00:16:45.589 "flush": true, 00:16:45.589 "reset": true, 00:16:45.589 "nvme_admin": false, 00:16:45.589 "nvme_io": false, 00:16:45.589 "nvme_io_md": false, 00:16:45.589 "write_zeroes": true, 00:16:45.589 "zcopy": true, 00:16:45.589 "get_zone_info": false, 00:16:45.589 "zone_management": false, 00:16:45.589 "zone_append": false, 00:16:45.589 "compare": false, 00:16:45.589 "compare_and_write": false, 00:16:45.589 "abort": true, 00:16:45.589 "seek_hole": false, 00:16:45.589 "seek_data": false, 00:16:45.589 "copy": true, 00:16:45.589 "nvme_iov_md": false 00:16:45.589 }, 00:16:45.589 "memory_domains": [ 00:16:45.589 { 00:16:45.589 "dma_device_id": "system", 00:16:45.589 "dma_device_type": 1 00:16:45.589 }, 00:16:45.589 { 00:16:45.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.589 "dma_device_type": 2 00:16:45.589 } 00:16:45.589 ], 00:16:45.589 "driver_specific": {} 00:16:45.589 } 00:16:45.589 ] 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.589 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.848 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.848 "name": "Existed_Raid", 00:16:45.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.848 "strip_size_kb": 0, 00:16:45.848 "state": "configuring", 00:16:45.848 "raid_level": "raid1", 00:16:45.848 "superblock": false, 00:16:45.848 "num_base_bdevs": 3, 00:16:45.848 "num_base_bdevs_discovered": 1, 00:16:45.848 "num_base_bdevs_operational": 3, 00:16:45.848 "base_bdevs_list": [ 00:16:45.848 { 00:16:45.848 "name": "BaseBdev1", 00:16:45.848 "uuid": "85873d97-62c7-4059-bd03-f3d55d1c2cfa", 00:16:45.848 "is_configured": true, 00:16:45.848 "data_offset": 0, 00:16:45.848 "data_size": 65536 00:16:45.848 }, 00:16:45.848 { 00:16:45.848 "name": "BaseBdev2", 00:16:45.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.848 "is_configured": 
false, 00:16:45.848 "data_offset": 0, 00:16:45.848 "data_size": 0 00:16:45.848 }, 00:16:45.848 { 00:16:45.848 "name": "BaseBdev3", 00:16:45.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.848 "is_configured": false, 00:16:45.848 "data_offset": 0, 00:16:45.848 "data_size": 0 00:16:45.848 } 00:16:45.848 ] 00:16:45.848 }' 00:16:45.848 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.848 19:52:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.415 19:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:46.673 [2024-07-24 19:52:38.166291] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:46.673 [2024-07-24 19:52:38.166327] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b52e0 name Existed_Raid, state configuring 00:16:46.673 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:46.932 [2024-07-24 19:52:38.410956] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:46.932 [2024-07-24 19:52:38.412426] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:46.932 [2024-07-24 19:52:38.412458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:46.932 [2024-07-24 19:52:38.412468] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:46.932 [2024-07-24 19:52:38.412480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.932 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.191 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.191 "name": "Existed_Raid", 00:16:47.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.191 "strip_size_kb": 0, 00:16:47.191 "state": "configuring", 00:16:47.191 "raid_level": "raid1", 00:16:47.191 "superblock": false, 00:16:47.191 "num_base_bdevs": 3, 
00:16:47.191 "num_base_bdevs_discovered": 1, 00:16:47.191 "num_base_bdevs_operational": 3, 00:16:47.191 "base_bdevs_list": [ 00:16:47.191 { 00:16:47.191 "name": "BaseBdev1", 00:16:47.191 "uuid": "85873d97-62c7-4059-bd03-f3d55d1c2cfa", 00:16:47.191 "is_configured": true, 00:16:47.191 "data_offset": 0, 00:16:47.191 "data_size": 65536 00:16:47.191 }, 00:16:47.191 { 00:16:47.191 "name": "BaseBdev2", 00:16:47.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.191 "is_configured": false, 00:16:47.191 "data_offset": 0, 00:16:47.191 "data_size": 0 00:16:47.191 }, 00:16:47.191 { 00:16:47.191 "name": "BaseBdev3", 00:16:47.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.191 "is_configured": false, 00:16:47.191 "data_offset": 0, 00:16:47.191 "data_size": 0 00:16:47.191 } 00:16:47.191 ] 00:16:47.191 }' 00:16:47.191 19:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.191 19:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.758 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:48.017 [2024-07-24 19:52:39.457048] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:48.017 BaseBdev2 00:16:48.017 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:48.017 19:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:48.017 19:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:48.017 19:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:48.017 19:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:48.017 19:52:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:48.017 19:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.276 19:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:48.534 [ 00:16:48.534 { 00:16:48.534 "name": "BaseBdev2", 00:16:48.534 "aliases": [ 00:16:48.534 "3c2b3a95-41d4-4107-853c-764436615b94" 00:16:48.534 ], 00:16:48.534 "product_name": "Malloc disk", 00:16:48.534 "block_size": 512, 00:16:48.534 "num_blocks": 65536, 00:16:48.534 "uuid": "3c2b3a95-41d4-4107-853c-764436615b94", 00:16:48.534 "assigned_rate_limits": { 00:16:48.534 "rw_ios_per_sec": 0, 00:16:48.534 "rw_mbytes_per_sec": 0, 00:16:48.534 "r_mbytes_per_sec": 0, 00:16:48.534 "w_mbytes_per_sec": 0 00:16:48.534 }, 00:16:48.534 "claimed": true, 00:16:48.534 "claim_type": "exclusive_write", 00:16:48.534 "zoned": false, 00:16:48.534 "supported_io_types": { 00:16:48.534 "read": true, 00:16:48.534 "write": true, 00:16:48.534 "unmap": true, 00:16:48.534 "flush": true, 00:16:48.534 "reset": true, 00:16:48.534 "nvme_admin": false, 00:16:48.534 "nvme_io": false, 00:16:48.535 "nvme_io_md": false, 00:16:48.535 "write_zeroes": true, 00:16:48.535 "zcopy": true, 00:16:48.535 "get_zone_info": false, 00:16:48.535 "zone_management": false, 00:16:48.535 "zone_append": false, 00:16:48.535 "compare": false, 00:16:48.535 "compare_and_write": false, 00:16:48.535 "abort": true, 00:16:48.535 "seek_hole": false, 00:16:48.535 "seek_data": false, 00:16:48.535 "copy": true, 00:16:48.535 "nvme_iov_md": false 00:16:48.535 }, 00:16:48.535 "memory_domains": [ 00:16:48.535 { 00:16:48.535 "dma_device_id": "system", 00:16:48.535 "dma_device_type": 1 00:16:48.535 }, 00:16:48.535 { 
00:16:48.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.535 "dma_device_type": 2 00:16:48.535 } 00:16:48.535 ], 00:16:48.535 "driver_specific": {} 00:16:48.535 } 00:16:48.535 ] 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.535 19:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:48.793 19:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.793 "name": "Existed_Raid", 00:16:48.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.793 "strip_size_kb": 0, 00:16:48.793 "state": "configuring", 00:16:48.793 "raid_level": "raid1", 00:16:48.793 "superblock": false, 00:16:48.793 "num_base_bdevs": 3, 00:16:48.793 "num_base_bdevs_discovered": 2, 00:16:48.793 "num_base_bdevs_operational": 3, 00:16:48.793 "base_bdevs_list": [ 00:16:48.793 { 00:16:48.793 "name": "BaseBdev1", 00:16:48.793 "uuid": "85873d97-62c7-4059-bd03-f3d55d1c2cfa", 00:16:48.793 "is_configured": true, 00:16:48.793 "data_offset": 0, 00:16:48.793 "data_size": 65536 00:16:48.793 }, 00:16:48.793 { 00:16:48.793 "name": "BaseBdev2", 00:16:48.793 "uuid": "3c2b3a95-41d4-4107-853c-764436615b94", 00:16:48.793 "is_configured": true, 00:16:48.793 "data_offset": 0, 00:16:48.793 "data_size": 65536 00:16:48.793 }, 00:16:48.793 { 00:16:48.793 "name": "BaseBdev3", 00:16:48.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.793 "is_configured": false, 00:16:48.793 "data_offset": 0, 00:16:48.793 "data_size": 0 00:16:48.793 } 00:16:48.793 ] 00:16:48.793 }' 00:16:48.793 19:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.793 19:52:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.359 19:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:49.618 [2024-07-24 19:52:41.064739] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:49.618 [2024-07-24 19:52:41.064780] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13b61d0 00:16:49.618 [2024-07-24 19:52:41.064789] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:16:49.618 [2024-07-24 19:52:41.064981] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13b5ea0 00:16:49.618 [2024-07-24 19:52:41.065112] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13b61d0 00:16:49.618 [2024-07-24 19:52:41.065123] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13b61d0 00:16:49.618 [2024-07-24 19:52:41.065285] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:49.618 BaseBdev3 00:16:49.618 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:49.618 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:49.618 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:49.618 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:49.618 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:49.618 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:49.618 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:49.877 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:50.135 [ 00:16:50.135 { 00:16:50.135 "name": "BaseBdev3", 00:16:50.135 "aliases": [ 00:16:50.135 "f68e76ec-51ba-46d3-8257-bb12a9fb00f3" 00:16:50.135 ], 00:16:50.135 "product_name": "Malloc disk", 00:16:50.135 "block_size": 512, 00:16:50.135 "num_blocks": 65536, 00:16:50.135 "uuid": "f68e76ec-51ba-46d3-8257-bb12a9fb00f3", 00:16:50.135 "assigned_rate_limits": { 
00:16:50.135 "rw_ios_per_sec": 0, 00:16:50.135 "rw_mbytes_per_sec": 0, 00:16:50.135 "r_mbytes_per_sec": 0, 00:16:50.135 "w_mbytes_per_sec": 0 00:16:50.135 }, 00:16:50.135 "claimed": true, 00:16:50.135 "claim_type": "exclusive_write", 00:16:50.135 "zoned": false, 00:16:50.135 "supported_io_types": { 00:16:50.135 "read": true, 00:16:50.135 "write": true, 00:16:50.135 "unmap": true, 00:16:50.135 "flush": true, 00:16:50.135 "reset": true, 00:16:50.135 "nvme_admin": false, 00:16:50.135 "nvme_io": false, 00:16:50.135 "nvme_io_md": false, 00:16:50.135 "write_zeroes": true, 00:16:50.135 "zcopy": true, 00:16:50.135 "get_zone_info": false, 00:16:50.135 "zone_management": false, 00:16:50.135 "zone_append": false, 00:16:50.135 "compare": false, 00:16:50.135 "compare_and_write": false, 00:16:50.135 "abort": true, 00:16:50.135 "seek_hole": false, 00:16:50.135 "seek_data": false, 00:16:50.135 "copy": true, 00:16:50.135 "nvme_iov_md": false 00:16:50.135 }, 00:16:50.135 "memory_domains": [ 00:16:50.135 { 00:16:50.135 "dma_device_id": "system", 00:16:50.135 "dma_device_type": 1 00:16:50.135 }, 00:16:50.135 { 00:16:50.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.135 "dma_device_type": 2 00:16:50.135 } 00:16:50.135 ], 00:16:50.135 "driver_specific": {} 00:16:50.135 } 00:16:50.135 ] 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:50.135 
19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:50.135 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.136 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.136 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.136 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.136 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.136 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.394 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.394 "name": "Existed_Raid", 00:16:50.394 "uuid": "c2782a27-4055-4a7d-be4d-7aa07f8db144", 00:16:50.394 "strip_size_kb": 0, 00:16:50.394 "state": "online", 00:16:50.394 "raid_level": "raid1", 00:16:50.394 "superblock": false, 00:16:50.394 "num_base_bdevs": 3, 00:16:50.394 "num_base_bdevs_discovered": 3, 00:16:50.394 "num_base_bdevs_operational": 3, 00:16:50.394 "base_bdevs_list": [ 00:16:50.394 { 00:16:50.394 "name": "BaseBdev1", 00:16:50.394 "uuid": "85873d97-62c7-4059-bd03-f3d55d1c2cfa", 00:16:50.394 "is_configured": true, 00:16:50.394 "data_offset": 0, 00:16:50.394 "data_size": 65536 00:16:50.394 }, 00:16:50.394 { 00:16:50.394 "name": "BaseBdev2", 00:16:50.394 "uuid": "3c2b3a95-41d4-4107-853c-764436615b94", 00:16:50.394 "is_configured": true, 00:16:50.394 "data_offset": 0, 
00:16:50.394 "data_size": 65536 00:16:50.394 }, 00:16:50.394 { 00:16:50.394 "name": "BaseBdev3", 00:16:50.394 "uuid": "f68e76ec-51ba-46d3-8257-bb12a9fb00f3", 00:16:50.394 "is_configured": true, 00:16:50.394 "data_offset": 0, 00:16:50.394 "data_size": 65536 00:16:50.394 } 00:16:50.394 ] 00:16:50.394 }' 00:16:50.394 19:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.395 19:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:50.963 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:51.221 [2024-07-24 19:52:42.673314] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:51.221 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:51.221 "name": "Existed_Raid", 00:16:51.221 "aliases": [ 00:16:51.221 "c2782a27-4055-4a7d-be4d-7aa07f8db144" 00:16:51.221 ], 00:16:51.221 "product_name": "Raid Volume", 00:16:51.221 "block_size": 512, 00:16:51.221 "num_blocks": 65536, 00:16:51.221 "uuid": 
"c2782a27-4055-4a7d-be4d-7aa07f8db144", 00:16:51.221 "assigned_rate_limits": { 00:16:51.221 "rw_ios_per_sec": 0, 00:16:51.221 "rw_mbytes_per_sec": 0, 00:16:51.221 "r_mbytes_per_sec": 0, 00:16:51.221 "w_mbytes_per_sec": 0 00:16:51.221 }, 00:16:51.221 "claimed": false, 00:16:51.221 "zoned": false, 00:16:51.221 "supported_io_types": { 00:16:51.221 "read": true, 00:16:51.221 "write": true, 00:16:51.221 "unmap": false, 00:16:51.221 "flush": false, 00:16:51.221 "reset": true, 00:16:51.221 "nvme_admin": false, 00:16:51.221 "nvme_io": false, 00:16:51.221 "nvme_io_md": false, 00:16:51.222 "write_zeroes": true, 00:16:51.222 "zcopy": false, 00:16:51.222 "get_zone_info": false, 00:16:51.222 "zone_management": false, 00:16:51.222 "zone_append": false, 00:16:51.222 "compare": false, 00:16:51.222 "compare_and_write": false, 00:16:51.222 "abort": false, 00:16:51.222 "seek_hole": false, 00:16:51.222 "seek_data": false, 00:16:51.222 "copy": false, 00:16:51.222 "nvme_iov_md": false 00:16:51.222 }, 00:16:51.222 "memory_domains": [ 00:16:51.222 { 00:16:51.222 "dma_device_id": "system", 00:16:51.222 "dma_device_type": 1 00:16:51.222 }, 00:16:51.222 { 00:16:51.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.222 "dma_device_type": 2 00:16:51.222 }, 00:16:51.222 { 00:16:51.222 "dma_device_id": "system", 00:16:51.222 "dma_device_type": 1 00:16:51.222 }, 00:16:51.222 { 00:16:51.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.222 "dma_device_type": 2 00:16:51.222 }, 00:16:51.222 { 00:16:51.222 "dma_device_id": "system", 00:16:51.222 "dma_device_type": 1 00:16:51.222 }, 00:16:51.222 { 00:16:51.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.222 "dma_device_type": 2 00:16:51.222 } 00:16:51.222 ], 00:16:51.222 "driver_specific": { 00:16:51.222 "raid": { 00:16:51.222 "uuid": "c2782a27-4055-4a7d-be4d-7aa07f8db144", 00:16:51.222 "strip_size_kb": 0, 00:16:51.222 "state": "online", 00:16:51.222 "raid_level": "raid1", 00:16:51.222 "superblock": false, 00:16:51.222 
"num_base_bdevs": 3, 00:16:51.222 "num_base_bdevs_discovered": 3, 00:16:51.222 "num_base_bdevs_operational": 3, 00:16:51.222 "base_bdevs_list": [ 00:16:51.222 { 00:16:51.222 "name": "BaseBdev1", 00:16:51.222 "uuid": "85873d97-62c7-4059-bd03-f3d55d1c2cfa", 00:16:51.222 "is_configured": true, 00:16:51.222 "data_offset": 0, 00:16:51.222 "data_size": 65536 00:16:51.222 }, 00:16:51.222 { 00:16:51.222 "name": "BaseBdev2", 00:16:51.222 "uuid": "3c2b3a95-41d4-4107-853c-764436615b94", 00:16:51.222 "is_configured": true, 00:16:51.222 "data_offset": 0, 00:16:51.222 "data_size": 65536 00:16:51.222 }, 00:16:51.222 { 00:16:51.222 "name": "BaseBdev3", 00:16:51.222 "uuid": "f68e76ec-51ba-46d3-8257-bb12a9fb00f3", 00:16:51.222 "is_configured": true, 00:16:51.222 "data_offset": 0, 00:16:51.222 "data_size": 65536 00:16:51.222 } 00:16:51.222 ] 00:16:51.222 } 00:16:51.222 } 00:16:51.222 }' 00:16:51.222 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:51.222 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:51.222 BaseBdev2 00:16:51.222 BaseBdev3' 00:16:51.222 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.222 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:51.222 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.481 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.481 "name": "BaseBdev1", 00:16:51.481 "aliases": [ 00:16:51.481 "85873d97-62c7-4059-bd03-f3d55d1c2cfa" 00:16:51.481 ], 00:16:51.481 "product_name": "Malloc disk", 00:16:51.481 "block_size": 512, 00:16:51.481 "num_blocks": 65536, 00:16:51.481 "uuid": 
"85873d97-62c7-4059-bd03-f3d55d1c2cfa", 00:16:51.481 "assigned_rate_limits": { 00:16:51.481 "rw_ios_per_sec": 0, 00:16:51.481 "rw_mbytes_per_sec": 0, 00:16:51.481 "r_mbytes_per_sec": 0, 00:16:51.481 "w_mbytes_per_sec": 0 00:16:51.481 }, 00:16:51.481 "claimed": true, 00:16:51.481 "claim_type": "exclusive_write", 00:16:51.481 "zoned": false, 00:16:51.481 "supported_io_types": { 00:16:51.481 "read": true, 00:16:51.481 "write": true, 00:16:51.481 "unmap": true, 00:16:51.481 "flush": true, 00:16:51.481 "reset": true, 00:16:51.481 "nvme_admin": false, 00:16:51.481 "nvme_io": false, 00:16:51.481 "nvme_io_md": false, 00:16:51.481 "write_zeroes": true, 00:16:51.481 "zcopy": true, 00:16:51.481 "get_zone_info": false, 00:16:51.481 "zone_management": false, 00:16:51.481 "zone_append": false, 00:16:51.481 "compare": false, 00:16:51.481 "compare_and_write": false, 00:16:51.481 "abort": true, 00:16:51.481 "seek_hole": false, 00:16:51.481 "seek_data": false, 00:16:51.481 "copy": true, 00:16:51.481 "nvme_iov_md": false 00:16:51.481 }, 00:16:51.481 "memory_domains": [ 00:16:51.481 { 00:16:51.481 "dma_device_id": "system", 00:16:51.481 "dma_device_type": 1 00:16:51.481 }, 00:16:51.481 { 00:16:51.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.481 "dma_device_type": 2 00:16:51.481 } 00:16:51.481 ], 00:16:51.481 "driver_specific": {} 00:16:51.481 }' 00:16:51.481 19:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.481 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.481 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.481 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.740 19:52:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:51.740 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.997 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.997 "name": "BaseBdev2", 00:16:51.997 "aliases": [ 00:16:51.997 "3c2b3a95-41d4-4107-853c-764436615b94" 00:16:51.997 ], 00:16:51.997 "product_name": "Malloc disk", 00:16:51.997 "block_size": 512, 00:16:51.997 "num_blocks": 65536, 00:16:51.997 "uuid": "3c2b3a95-41d4-4107-853c-764436615b94", 00:16:51.997 "assigned_rate_limits": { 00:16:51.997 "rw_ios_per_sec": 0, 00:16:51.997 "rw_mbytes_per_sec": 0, 00:16:51.997 "r_mbytes_per_sec": 0, 00:16:51.997 "w_mbytes_per_sec": 0 00:16:51.997 }, 00:16:51.997 "claimed": true, 00:16:51.997 "claim_type": "exclusive_write", 00:16:51.997 "zoned": false, 00:16:51.997 "supported_io_types": { 00:16:51.997 "read": true, 00:16:51.997 "write": true, 00:16:51.997 "unmap": true, 00:16:51.997 "flush": true, 00:16:51.997 "reset": true, 00:16:51.997 "nvme_admin": false, 00:16:51.997 "nvme_io": false, 00:16:51.997 "nvme_io_md": false, 
00:16:51.997 "write_zeroes": true, 00:16:51.997 "zcopy": true, 00:16:51.997 "get_zone_info": false, 00:16:51.997 "zone_management": false, 00:16:51.997 "zone_append": false, 00:16:51.997 "compare": false, 00:16:51.997 "compare_and_write": false, 00:16:51.997 "abort": true, 00:16:51.997 "seek_hole": false, 00:16:51.997 "seek_data": false, 00:16:51.997 "copy": true, 00:16:51.997 "nvme_iov_md": false 00:16:51.997 }, 00:16:51.997 "memory_domains": [ 00:16:51.997 { 00:16:51.997 "dma_device_id": "system", 00:16:51.997 "dma_device_type": 1 00:16:51.997 }, 00:16:51.997 { 00:16:51.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.997 "dma_device_type": 2 00:16:51.997 } 00:16:51.997 ], 00:16:51.997 "driver_specific": {} 00:16:51.997 }' 00:16:51.997 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.255 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.255 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.255 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.255 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.255 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.255 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.255 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.513 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.513 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.513 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.513 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.513 19:52:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.514 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:52.514 19:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.772 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.772 "name": "BaseBdev3", 00:16:52.772 "aliases": [ 00:16:52.772 "f68e76ec-51ba-46d3-8257-bb12a9fb00f3" 00:16:52.772 ], 00:16:52.772 "product_name": "Malloc disk", 00:16:52.772 "block_size": 512, 00:16:52.772 "num_blocks": 65536, 00:16:52.772 "uuid": "f68e76ec-51ba-46d3-8257-bb12a9fb00f3", 00:16:52.772 "assigned_rate_limits": { 00:16:52.772 "rw_ios_per_sec": 0, 00:16:52.772 "rw_mbytes_per_sec": 0, 00:16:52.772 "r_mbytes_per_sec": 0, 00:16:52.772 "w_mbytes_per_sec": 0 00:16:52.772 }, 00:16:52.772 "claimed": true, 00:16:52.772 "claim_type": "exclusive_write", 00:16:52.772 "zoned": false, 00:16:52.772 "supported_io_types": { 00:16:52.772 "read": true, 00:16:52.772 "write": true, 00:16:52.772 "unmap": true, 00:16:52.772 "flush": true, 00:16:52.772 "reset": true, 00:16:52.772 "nvme_admin": false, 00:16:52.772 "nvme_io": false, 00:16:52.772 "nvme_io_md": false, 00:16:52.772 "write_zeroes": true, 00:16:52.772 "zcopy": true, 00:16:52.772 "get_zone_info": false, 00:16:52.772 "zone_management": false, 00:16:52.772 "zone_append": false, 00:16:52.772 "compare": false, 00:16:52.772 "compare_and_write": false, 00:16:52.772 "abort": true, 00:16:52.772 "seek_hole": false, 00:16:52.772 "seek_data": false, 00:16:52.772 "copy": true, 00:16:52.772 "nvme_iov_md": false 00:16:52.772 }, 00:16:52.772 "memory_domains": [ 00:16:52.772 { 00:16:52.772 "dma_device_id": "system", 00:16:52.772 "dma_device_type": 1 00:16:52.772 }, 00:16:52.772 { 00:16:52.772 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:52.772 "dma_device_type": 2 00:16:52.772 } 00:16:52.772 ], 00:16:52.772 "driver_specific": {} 00:16:52.772 }' 00:16:52.772 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.772 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.772 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.773 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.773 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.032 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:53.290 [2024-07-24 19:52:44.702609] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:53.290 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.291 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.549 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.549 "name": "Existed_Raid", 00:16:53.549 "uuid": "c2782a27-4055-4a7d-be4d-7aa07f8db144", 00:16:53.549 "strip_size_kb": 0, 00:16:53.549 "state": "online", 00:16:53.549 "raid_level": "raid1", 
00:16:53.549 "superblock": false, 00:16:53.549 "num_base_bdevs": 3, 00:16:53.549 "num_base_bdevs_discovered": 2, 00:16:53.549 "num_base_bdevs_operational": 2, 00:16:53.549 "base_bdevs_list": [ 00:16:53.549 { 00:16:53.549 "name": null, 00:16:53.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.549 "is_configured": false, 00:16:53.549 "data_offset": 0, 00:16:53.549 "data_size": 65536 00:16:53.549 }, 00:16:53.549 { 00:16:53.549 "name": "BaseBdev2", 00:16:53.549 "uuid": "3c2b3a95-41d4-4107-853c-764436615b94", 00:16:53.549 "is_configured": true, 00:16:53.549 "data_offset": 0, 00:16:53.549 "data_size": 65536 00:16:53.549 }, 00:16:53.549 { 00:16:53.549 "name": "BaseBdev3", 00:16:53.549 "uuid": "f68e76ec-51ba-46d3-8257-bb12a9fb00f3", 00:16:53.549 "is_configured": true, 00:16:53.549 "data_offset": 0, 00:16:53.549 "data_size": 65536 00:16:53.549 } 00:16:53.549 ] 00:16:53.549 }' 00:16:53.549 19:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.549 19:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.116 19:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:54.116 19:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:54.116 19:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:54.116 19:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.374 19:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:54.374 19:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:54.374 19:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:54.633 [2024-07-24 19:52:46.115368] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:54.633 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:54.633 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:54.633 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.633 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:54.892 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:54.892 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:54.892 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:55.150 [2024-07-24 19:52:46.621151] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:55.150 [2024-07-24 19:52:46.621229] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.150 [2024-07-24 19:52:46.633951] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:55.150 [2024-07-24 19:52:46.633986] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.150 [2024-07-24 19:52:46.633998] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b61d0 name Existed_Raid, state offline 00:16:55.150 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:55.150 19:52:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:55.150 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.150 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:55.408 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:55.408 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:55.408 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:55.408 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:55.408 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:55.408 19:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:55.667 BaseBdev2 00:16:55.667 19:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:55.667 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:55.667 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:55.667 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:55.667 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:55.667 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:55.667 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:55.925 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:56.185 [ 00:16:56.185 { 00:16:56.185 "name": "BaseBdev2", 00:16:56.185 "aliases": [ 00:16:56.185 "c2d087b6-aa66-4317-b035-b228fa9b0e5b" 00:16:56.185 ], 00:16:56.185 "product_name": "Malloc disk", 00:16:56.185 "block_size": 512, 00:16:56.185 "num_blocks": 65536, 00:16:56.185 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:16:56.185 "assigned_rate_limits": { 00:16:56.185 "rw_ios_per_sec": 0, 00:16:56.185 "rw_mbytes_per_sec": 0, 00:16:56.185 "r_mbytes_per_sec": 0, 00:16:56.185 "w_mbytes_per_sec": 0 00:16:56.185 }, 00:16:56.185 "claimed": false, 00:16:56.185 "zoned": false, 00:16:56.185 "supported_io_types": { 00:16:56.185 "read": true, 00:16:56.185 "write": true, 00:16:56.185 "unmap": true, 00:16:56.185 "flush": true, 00:16:56.185 "reset": true, 00:16:56.185 "nvme_admin": false, 00:16:56.185 "nvme_io": false, 00:16:56.185 "nvme_io_md": false, 00:16:56.185 "write_zeroes": true, 00:16:56.185 "zcopy": true, 00:16:56.185 "get_zone_info": false, 00:16:56.185 "zone_management": false, 00:16:56.185 "zone_append": false, 00:16:56.185 "compare": false, 00:16:56.185 "compare_and_write": false, 00:16:56.185 "abort": true, 00:16:56.185 "seek_hole": false, 00:16:56.185 "seek_data": false, 00:16:56.185 "copy": true, 00:16:56.185 "nvme_iov_md": false 00:16:56.185 }, 00:16:56.185 "memory_domains": [ 00:16:56.185 { 00:16:56.185 "dma_device_id": "system", 00:16:56.185 "dma_device_type": 1 00:16:56.185 }, 00:16:56.185 { 00:16:56.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.185 "dma_device_type": 2 00:16:56.185 } 00:16:56.185 ], 00:16:56.185 "driver_specific": {} 00:16:56.185 } 00:16:56.185 ] 00:16:56.185 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:56.185 
19:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:56.185 19:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:56.185 19:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:56.444 BaseBdev3 00:16:56.444 19:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:56.444 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:56.444 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:56.444 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:56.444 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:56.444 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:56.444 19:52:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:56.702 19:52:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:56.960 [ 00:16:56.960 { 00:16:56.960 "name": "BaseBdev3", 00:16:56.960 "aliases": [ 00:16:56.960 "bd9a3f71-ce69-4ad1-9e81-4528d199c696" 00:16:56.960 ], 00:16:56.960 "product_name": "Malloc disk", 00:16:56.960 "block_size": 512, 00:16:56.960 "num_blocks": 65536, 00:16:56.960 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:16:56.960 "assigned_rate_limits": { 00:16:56.960 "rw_ios_per_sec": 0, 00:16:56.960 "rw_mbytes_per_sec": 0, 00:16:56.960 
"r_mbytes_per_sec": 0, 00:16:56.960 "w_mbytes_per_sec": 0 00:16:56.961 }, 00:16:56.961 "claimed": false, 00:16:56.961 "zoned": false, 00:16:56.961 "supported_io_types": { 00:16:56.961 "read": true, 00:16:56.961 "write": true, 00:16:56.961 "unmap": true, 00:16:56.961 "flush": true, 00:16:56.961 "reset": true, 00:16:56.961 "nvme_admin": false, 00:16:56.961 "nvme_io": false, 00:16:56.961 "nvme_io_md": false, 00:16:56.961 "write_zeroes": true, 00:16:56.961 "zcopy": true, 00:16:56.961 "get_zone_info": false, 00:16:56.961 "zone_management": false, 00:16:56.961 "zone_append": false, 00:16:56.961 "compare": false, 00:16:56.961 "compare_and_write": false, 00:16:56.961 "abort": true, 00:16:56.961 "seek_hole": false, 00:16:56.961 "seek_data": false, 00:16:56.961 "copy": true, 00:16:56.961 "nvme_iov_md": false 00:16:56.961 }, 00:16:56.961 "memory_domains": [ 00:16:56.961 { 00:16:56.961 "dma_device_id": "system", 00:16:56.961 "dma_device_type": 1 00:16:56.961 }, 00:16:56.961 { 00:16:56.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.961 "dma_device_type": 2 00:16:56.961 } 00:16:56.961 ], 00:16:56.961 "driver_specific": {} 00:16:56.961 } 00:16:56.961 ] 00:16:56.961 19:52:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:56.961 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:56.961 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:56.961 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:57.220 [2024-07-24 19:52:48.605058] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:57.220 [2024-07-24 19:52:48.605099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:16:57.220 [2024-07-24 19:52:48.605119] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:57.220 [2024-07-24 19:52:48.606486] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.220 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.479 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.479 "name": "Existed_Raid", 00:16:57.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.479 "strip_size_kb": 0, 00:16:57.479 "state": 
"configuring", 00:16:57.479 "raid_level": "raid1", 00:16:57.479 "superblock": false, 00:16:57.479 "num_base_bdevs": 3, 00:16:57.479 "num_base_bdevs_discovered": 2, 00:16:57.479 "num_base_bdevs_operational": 3, 00:16:57.479 "base_bdevs_list": [ 00:16:57.479 { 00:16:57.479 "name": "BaseBdev1", 00:16:57.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.479 "is_configured": false, 00:16:57.479 "data_offset": 0, 00:16:57.479 "data_size": 0 00:16:57.479 }, 00:16:57.479 { 00:16:57.479 "name": "BaseBdev2", 00:16:57.479 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:16:57.479 "is_configured": true, 00:16:57.479 "data_offset": 0, 00:16:57.479 "data_size": 65536 00:16:57.479 }, 00:16:57.479 { 00:16:57.479 "name": "BaseBdev3", 00:16:57.479 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:16:57.479 "is_configured": true, 00:16:57.479 "data_offset": 0, 00:16:57.479 "data_size": 65536 00:16:57.479 } 00:16:57.479 ] 00:16:57.479 }' 00:16:57.479 19:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.479 19:52:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.047 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:58.305 [2024-07-24 19:52:49.683918] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.305 19:52:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.305 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.581 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.581 "name": "Existed_Raid", 00:16:58.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.581 "strip_size_kb": 0, 00:16:58.581 "state": "configuring", 00:16:58.581 "raid_level": "raid1", 00:16:58.581 "superblock": false, 00:16:58.581 "num_base_bdevs": 3, 00:16:58.581 "num_base_bdevs_discovered": 1, 00:16:58.581 "num_base_bdevs_operational": 3, 00:16:58.581 "base_bdevs_list": [ 00:16:58.581 { 00:16:58.581 "name": "BaseBdev1", 00:16:58.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.581 "is_configured": false, 00:16:58.581 "data_offset": 0, 00:16:58.581 "data_size": 0 00:16:58.581 }, 00:16:58.581 { 00:16:58.581 "name": null, 00:16:58.581 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:16:58.581 "is_configured": false, 00:16:58.581 "data_offset": 0, 00:16:58.581 "data_size": 65536 00:16:58.581 }, 00:16:58.581 { 00:16:58.582 "name": "BaseBdev3", 00:16:58.582 "uuid": 
"bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:16:58.582 "is_configured": true, 00:16:58.582 "data_offset": 0, 00:16:58.582 "data_size": 65536 00:16:58.582 } 00:16:58.582 ] 00:16:58.582 }' 00:16:58.582 19:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.582 19:52:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.181 19:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:59.181 19:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.440 19:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:59.440 19:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:59.440 [2024-07-24 19:52:51.026833] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:59.440 BaseBdev1 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:59.698 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:59.958 [ 00:16:59.958 { 00:16:59.958 "name": "BaseBdev1", 00:16:59.958 "aliases": [ 00:16:59.958 "b8f2e108-a127-4fc3-9f45-9b38aef1429b" 00:16:59.958 ], 00:16:59.958 "product_name": "Malloc disk", 00:16:59.958 "block_size": 512, 00:16:59.958 "num_blocks": 65536, 00:16:59.958 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:16:59.958 "assigned_rate_limits": { 00:16:59.958 "rw_ios_per_sec": 0, 00:16:59.958 "rw_mbytes_per_sec": 0, 00:16:59.958 "r_mbytes_per_sec": 0, 00:16:59.958 "w_mbytes_per_sec": 0 00:16:59.958 }, 00:16:59.958 "claimed": true, 00:16:59.958 "claim_type": "exclusive_write", 00:16:59.958 "zoned": false, 00:16:59.958 "supported_io_types": { 00:16:59.958 "read": true, 00:16:59.958 "write": true, 00:16:59.958 "unmap": true, 00:16:59.958 "flush": true, 00:16:59.958 "reset": true, 00:16:59.958 "nvme_admin": false, 00:16:59.958 "nvme_io": false, 00:16:59.958 "nvme_io_md": false, 00:16:59.958 "write_zeroes": true, 00:16:59.958 "zcopy": true, 00:16:59.958 "get_zone_info": false, 00:16:59.958 "zone_management": false, 00:16:59.958 "zone_append": false, 00:16:59.958 "compare": false, 00:16:59.958 "compare_and_write": false, 00:16:59.958 "abort": true, 00:16:59.958 "seek_hole": false, 00:16:59.958 "seek_data": false, 00:16:59.958 "copy": true, 00:16:59.958 "nvme_iov_md": false 00:16:59.958 }, 00:16:59.958 "memory_domains": [ 00:16:59.958 { 00:16:59.958 "dma_device_id": "system", 00:16:59.958 "dma_device_type": 1 00:16:59.958 }, 00:16:59.958 { 00:16:59.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.958 "dma_device_type": 2 00:16:59.958 } 00:16:59.958 ], 00:16:59.958 "driver_specific": {} 00:16:59.958 } 00:16:59.958 ] 
00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.958 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.217 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.217 "name": "Existed_Raid", 00:17:00.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.217 "strip_size_kb": 0, 00:17:00.217 "state": "configuring", 00:17:00.217 "raid_level": "raid1", 00:17:00.217 "superblock": false, 00:17:00.217 "num_base_bdevs": 3, 00:17:00.217 
"num_base_bdevs_discovered": 2, 00:17:00.217 "num_base_bdevs_operational": 3, 00:17:00.217 "base_bdevs_list": [ 00:17:00.217 { 00:17:00.217 "name": "BaseBdev1", 00:17:00.217 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:00.217 "is_configured": true, 00:17:00.217 "data_offset": 0, 00:17:00.217 "data_size": 65536 00:17:00.217 }, 00:17:00.217 { 00:17:00.217 "name": null, 00:17:00.217 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:00.217 "is_configured": false, 00:17:00.217 "data_offset": 0, 00:17:00.217 "data_size": 65536 00:17:00.217 }, 00:17:00.217 { 00:17:00.217 "name": "BaseBdev3", 00:17:00.217 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:00.217 "is_configured": true, 00:17:00.217 "data_offset": 0, 00:17:00.217 "data_size": 65536 00:17:00.217 } 00:17:00.217 ] 00:17:00.217 }' 00:17:00.217 19:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.217 19:52:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.153 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.153 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:01.153 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:01.153 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:01.412 [2024-07-24 19:52:52.891812] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.412 19:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.671 19:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.671 "name": "Existed_Raid", 00:17:01.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.671 "strip_size_kb": 0, 00:17:01.671 "state": "configuring", 00:17:01.671 "raid_level": "raid1", 00:17:01.671 "superblock": false, 00:17:01.671 "num_base_bdevs": 3, 00:17:01.671 "num_base_bdevs_discovered": 1, 00:17:01.671 "num_base_bdevs_operational": 3, 00:17:01.671 "base_bdevs_list": [ 00:17:01.671 { 00:17:01.671 "name": "BaseBdev1", 00:17:01.671 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:01.671 "is_configured": true, 00:17:01.671 "data_offset": 0, 00:17:01.671 "data_size": 65536 
00:17:01.671 }, 00:17:01.671 { 00:17:01.671 "name": null, 00:17:01.671 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:01.671 "is_configured": false, 00:17:01.671 "data_offset": 0, 00:17:01.671 "data_size": 65536 00:17:01.671 }, 00:17:01.671 { 00:17:01.671 "name": null, 00:17:01.671 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:01.671 "is_configured": false, 00:17:01.671 "data_offset": 0, 00:17:01.671 "data_size": 65536 00:17:01.671 } 00:17:01.671 ] 00:17:01.671 }' 00:17:01.671 19:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.671 19:52:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.613 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.613 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:02.871 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:02.871 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:03.130 [2024-07-24 19:52:54.516132] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.130 19:52:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.130 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.389 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.389 "name": "Existed_Raid", 00:17:03.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.389 "strip_size_kb": 0, 00:17:03.389 "state": "configuring", 00:17:03.389 "raid_level": "raid1", 00:17:03.389 "superblock": false, 00:17:03.389 "num_base_bdevs": 3, 00:17:03.389 "num_base_bdevs_discovered": 2, 00:17:03.389 "num_base_bdevs_operational": 3, 00:17:03.389 "base_bdevs_list": [ 00:17:03.389 { 00:17:03.389 "name": "BaseBdev1", 00:17:03.389 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:03.390 "is_configured": true, 00:17:03.390 "data_offset": 0, 00:17:03.390 "data_size": 65536 00:17:03.390 }, 00:17:03.390 { 00:17:03.390 "name": null, 00:17:03.390 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:03.390 "is_configured": false, 00:17:03.390 "data_offset": 0, 00:17:03.390 "data_size": 65536 00:17:03.390 }, 00:17:03.390 { 00:17:03.390 "name": "BaseBdev3", 00:17:03.390 "uuid": 
"bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:03.390 "is_configured": true, 00:17:03.390 "data_offset": 0, 00:17:03.390 "data_size": 65536 00:17:03.390 } 00:17:03.390 ] 00:17:03.390 }' 00:17:03.390 19:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.390 19:52:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.956 19:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.956 19:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:04.214 19:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:04.214 19:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:04.781 [2024-07-24 19:52:56.092336] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.781 19:52:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.781 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.039 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.039 "name": "Existed_Raid", 00:17:05.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.039 "strip_size_kb": 0, 00:17:05.039 "state": "configuring", 00:17:05.039 "raid_level": "raid1", 00:17:05.039 "superblock": false, 00:17:05.039 "num_base_bdevs": 3, 00:17:05.039 "num_base_bdevs_discovered": 1, 00:17:05.039 "num_base_bdevs_operational": 3, 00:17:05.039 "base_bdevs_list": [ 00:17:05.039 { 00:17:05.039 "name": null, 00:17:05.039 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:05.039 "is_configured": false, 00:17:05.039 "data_offset": 0, 00:17:05.039 "data_size": 65536 00:17:05.040 }, 00:17:05.040 { 00:17:05.040 "name": null, 00:17:05.040 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:05.040 "is_configured": false, 00:17:05.040 "data_offset": 0, 00:17:05.040 "data_size": 65536 00:17:05.040 }, 00:17:05.040 { 00:17:05.040 "name": "BaseBdev3", 00:17:05.040 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:05.040 "is_configured": true, 00:17:05.040 "data_offset": 0, 00:17:05.040 "data_size": 65536 00:17:05.040 } 00:17:05.040 ] 00:17:05.040 }' 00:17:05.040 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.040 19:52:56 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:05.607 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.607 19:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:05.866 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:05.866 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:06.124 [2024-07-24 19:52:57.462430] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:06.124 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.382 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.382 "name": "Existed_Raid", 00:17:06.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.382 "strip_size_kb": 0, 00:17:06.382 "state": "configuring", 00:17:06.382 "raid_level": "raid1", 00:17:06.382 "superblock": false, 00:17:06.382 "num_base_bdevs": 3, 00:17:06.382 "num_base_bdevs_discovered": 2, 00:17:06.382 "num_base_bdevs_operational": 3, 00:17:06.382 "base_bdevs_list": [ 00:17:06.382 { 00:17:06.382 "name": null, 00:17:06.382 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:06.382 "is_configured": false, 00:17:06.382 "data_offset": 0, 00:17:06.382 "data_size": 65536 00:17:06.382 }, 00:17:06.382 { 00:17:06.382 "name": "BaseBdev2", 00:17:06.382 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:06.382 "is_configured": true, 00:17:06.382 "data_offset": 0, 00:17:06.382 "data_size": 65536 00:17:06.382 }, 00:17:06.382 { 00:17:06.382 "name": "BaseBdev3", 00:17:06.382 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:06.382 "is_configured": true, 00:17:06.382 "data_offset": 0, 00:17:06.382 "data_size": 65536 00:17:06.382 } 00:17:06.382 ] 00:17:06.382 }' 00:17:06.382 19:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.382 19:52:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.949 19:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.949 19:52:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:07.208 19:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:07.208 19:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.208 19:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:07.467 19:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b8f2e108-a127-4fc3-9f45-9b38aef1429b 00:17:07.725 [2024-07-24 19:52:59.063222] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:07.725 [2024-07-24 19:52:59.063262] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13b7530 00:17:07.725 [2024-07-24 19:52:59.063270] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:07.725 [2024-07-24 19:52:59.063473] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x155e870 00:17:07.725 [2024-07-24 19:52:59.063605] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13b7530 00:17:07.725 [2024-07-24 19:52:59.063615] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13b7530 00:17:07.725 [2024-07-24 19:52:59.063780] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:07.725 NewBaseBdev 00:17:07.725 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:07.726 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:17:07.726 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- 
# local bdev_timeout= 00:17:07.726 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:07.726 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:07.726 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:07.726 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:07.985 [ 00:17:07.985 { 00:17:07.985 "name": "NewBaseBdev", 00:17:07.985 "aliases": [ 00:17:07.985 "b8f2e108-a127-4fc3-9f45-9b38aef1429b" 00:17:07.985 ], 00:17:07.985 "product_name": "Malloc disk", 00:17:07.985 "block_size": 512, 00:17:07.985 "num_blocks": 65536, 00:17:07.985 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:07.985 "assigned_rate_limits": { 00:17:07.985 "rw_ios_per_sec": 0, 00:17:07.985 "rw_mbytes_per_sec": 0, 00:17:07.985 "r_mbytes_per_sec": 0, 00:17:07.985 "w_mbytes_per_sec": 0 00:17:07.985 }, 00:17:07.985 "claimed": true, 00:17:07.985 "claim_type": "exclusive_write", 00:17:07.985 "zoned": false, 00:17:07.985 "supported_io_types": { 00:17:07.985 "read": true, 00:17:07.985 "write": true, 00:17:07.985 "unmap": true, 00:17:07.985 "flush": true, 00:17:07.985 "reset": true, 00:17:07.985 "nvme_admin": false, 00:17:07.985 "nvme_io": false, 00:17:07.985 "nvme_io_md": false, 00:17:07.985 "write_zeroes": true, 00:17:07.985 "zcopy": true, 00:17:07.985 "get_zone_info": false, 00:17:07.985 "zone_management": false, 00:17:07.985 "zone_append": false, 00:17:07.985 "compare": false, 00:17:07.985 "compare_and_write": false, 00:17:07.985 "abort": true, 00:17:07.985 "seek_hole": false, 
00:17:07.985 "seek_data": false, 00:17:07.985 "copy": true, 00:17:07.985 "nvme_iov_md": false 00:17:07.985 }, 00:17:07.985 "memory_domains": [ 00:17:07.985 { 00:17:07.985 "dma_device_id": "system", 00:17:07.985 "dma_device_type": 1 00:17:07.985 }, 00:17:07.985 { 00:17:07.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.985 "dma_device_type": 2 00:17:07.985 } 00:17:07.985 ], 00:17:07.985 "driver_specific": {} 00:17:07.985 } 00:17:07.985 ] 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.985 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:17:08.244 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.244 "name": "Existed_Raid", 00:17:08.244 "uuid": "c54baf9e-5e1c-4694-8820-0d3c7a965439", 00:17:08.244 "strip_size_kb": 0, 00:17:08.244 "state": "online", 00:17:08.244 "raid_level": "raid1", 00:17:08.244 "superblock": false, 00:17:08.244 "num_base_bdevs": 3, 00:17:08.244 "num_base_bdevs_discovered": 3, 00:17:08.244 "num_base_bdevs_operational": 3, 00:17:08.244 "base_bdevs_list": [ 00:17:08.244 { 00:17:08.244 "name": "NewBaseBdev", 00:17:08.244 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:08.244 "is_configured": true, 00:17:08.244 "data_offset": 0, 00:17:08.244 "data_size": 65536 00:17:08.244 }, 00:17:08.244 { 00:17:08.244 "name": "BaseBdev2", 00:17:08.244 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:08.244 "is_configured": true, 00:17:08.244 "data_offset": 0, 00:17:08.244 "data_size": 65536 00:17:08.244 }, 00:17:08.244 { 00:17:08.244 "name": "BaseBdev3", 00:17:08.244 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:08.244 "is_configured": true, 00:17:08.244 "data_offset": 0, 00:17:08.244 "data_size": 65536 00:17:08.244 } 00:17:08.244 ] 00:17:08.244 }' 00:17:08.244 19:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.244 19:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:09.181 [2024-07-24 19:53:00.639718] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:09.181 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:09.181 "name": "Existed_Raid", 00:17:09.181 "aliases": [ 00:17:09.181 "c54baf9e-5e1c-4694-8820-0d3c7a965439" 00:17:09.181 ], 00:17:09.181 "product_name": "Raid Volume", 00:17:09.181 "block_size": 512, 00:17:09.181 "num_blocks": 65536, 00:17:09.181 "uuid": "c54baf9e-5e1c-4694-8820-0d3c7a965439", 00:17:09.181 "assigned_rate_limits": { 00:17:09.181 "rw_ios_per_sec": 0, 00:17:09.181 "rw_mbytes_per_sec": 0, 00:17:09.181 "r_mbytes_per_sec": 0, 00:17:09.181 "w_mbytes_per_sec": 0 00:17:09.181 }, 00:17:09.181 "claimed": false, 00:17:09.181 "zoned": false, 00:17:09.181 "supported_io_types": { 00:17:09.181 "read": true, 00:17:09.181 "write": true, 00:17:09.181 "unmap": false, 00:17:09.181 "flush": false, 00:17:09.181 "reset": true, 00:17:09.181 "nvme_admin": false, 00:17:09.181 "nvme_io": false, 00:17:09.181 "nvme_io_md": false, 00:17:09.181 "write_zeroes": true, 00:17:09.181 "zcopy": false, 00:17:09.181 "get_zone_info": false, 00:17:09.181 "zone_management": false, 00:17:09.181 "zone_append": false, 00:17:09.181 "compare": false, 00:17:09.181 "compare_and_write": false, 00:17:09.181 "abort": false, 00:17:09.181 "seek_hole": false, 00:17:09.181 "seek_data": false, 00:17:09.181 "copy": false, 00:17:09.181 "nvme_iov_md": false 00:17:09.181 }, 00:17:09.181 "memory_domains": [ 00:17:09.181 { 00:17:09.181 "dma_device_id": "system", 
00:17:09.181 "dma_device_type": 1 00:17:09.181 }, 00:17:09.181 { 00:17:09.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.181 "dma_device_type": 2 00:17:09.181 }, 00:17:09.181 { 00:17:09.181 "dma_device_id": "system", 00:17:09.182 "dma_device_type": 1 00:17:09.182 }, 00:17:09.182 { 00:17:09.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.182 "dma_device_type": 2 00:17:09.182 }, 00:17:09.182 { 00:17:09.182 "dma_device_id": "system", 00:17:09.182 "dma_device_type": 1 00:17:09.182 }, 00:17:09.182 { 00:17:09.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.182 "dma_device_type": 2 00:17:09.182 } 00:17:09.182 ], 00:17:09.182 "driver_specific": { 00:17:09.182 "raid": { 00:17:09.182 "uuid": "c54baf9e-5e1c-4694-8820-0d3c7a965439", 00:17:09.182 "strip_size_kb": 0, 00:17:09.182 "state": "online", 00:17:09.182 "raid_level": "raid1", 00:17:09.182 "superblock": false, 00:17:09.182 "num_base_bdevs": 3, 00:17:09.182 "num_base_bdevs_discovered": 3, 00:17:09.182 "num_base_bdevs_operational": 3, 00:17:09.182 "base_bdevs_list": [ 00:17:09.182 { 00:17:09.182 "name": "NewBaseBdev", 00:17:09.182 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:09.182 "is_configured": true, 00:17:09.182 "data_offset": 0, 00:17:09.182 "data_size": 65536 00:17:09.182 }, 00:17:09.182 { 00:17:09.182 "name": "BaseBdev2", 00:17:09.182 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:09.182 "is_configured": true, 00:17:09.182 "data_offset": 0, 00:17:09.182 "data_size": 65536 00:17:09.182 }, 00:17:09.182 { 00:17:09.182 "name": "BaseBdev3", 00:17:09.182 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:09.182 "is_configured": true, 00:17:09.182 "data_offset": 0, 00:17:09.182 "data_size": 65536 00:17:09.182 } 00:17:09.182 ] 00:17:09.182 } 00:17:09.182 } 00:17:09.182 }' 00:17:09.182 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:09.182 19:53:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:09.182 BaseBdev2 00:17:09.182 BaseBdev3' 00:17:09.182 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:09.182 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:09.182 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:09.441 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:09.441 "name": "NewBaseBdev", 00:17:09.441 "aliases": [ 00:17:09.441 "b8f2e108-a127-4fc3-9f45-9b38aef1429b" 00:17:09.441 ], 00:17:09.441 "product_name": "Malloc disk", 00:17:09.441 "block_size": 512, 00:17:09.441 "num_blocks": 65536, 00:17:09.441 "uuid": "b8f2e108-a127-4fc3-9f45-9b38aef1429b", 00:17:09.441 "assigned_rate_limits": { 00:17:09.441 "rw_ios_per_sec": 0, 00:17:09.441 "rw_mbytes_per_sec": 0, 00:17:09.441 "r_mbytes_per_sec": 0, 00:17:09.441 "w_mbytes_per_sec": 0 00:17:09.441 }, 00:17:09.441 "claimed": true, 00:17:09.441 "claim_type": "exclusive_write", 00:17:09.441 "zoned": false, 00:17:09.441 "supported_io_types": { 00:17:09.441 "read": true, 00:17:09.441 "write": true, 00:17:09.441 "unmap": true, 00:17:09.441 "flush": true, 00:17:09.441 "reset": true, 00:17:09.441 "nvme_admin": false, 00:17:09.441 "nvme_io": false, 00:17:09.441 "nvme_io_md": false, 00:17:09.441 "write_zeroes": true, 00:17:09.441 "zcopy": true, 00:17:09.441 "get_zone_info": false, 00:17:09.441 "zone_management": false, 00:17:09.441 "zone_append": false, 00:17:09.441 "compare": false, 00:17:09.441 "compare_and_write": false, 00:17:09.441 "abort": true, 00:17:09.441 "seek_hole": false, 00:17:09.441 "seek_data": false, 00:17:09.441 "copy": true, 00:17:09.441 "nvme_iov_md": false 00:17:09.441 }, 00:17:09.441 "memory_domains": [ 
00:17:09.441 { 00:17:09.441 "dma_device_id": "system", 00:17:09.441 "dma_device_type": 1 00:17:09.441 }, 00:17:09.441 { 00:17:09.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.441 "dma_device_type": 2 00:17:09.441 } 00:17:09.441 ], 00:17:09.441 "driver_specific": {} 00:17:09.441 }' 00:17:09.441 19:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.441 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.699 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.958 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:09.958 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:09.958 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:09.958 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:10.217 19:53:01 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:10.217 "name": "BaseBdev2", 00:17:10.217 "aliases": [ 00:17:10.217 "c2d087b6-aa66-4317-b035-b228fa9b0e5b" 00:17:10.217 ], 00:17:10.217 "product_name": "Malloc disk", 00:17:10.217 "block_size": 512, 00:17:10.217 "num_blocks": 65536, 00:17:10.217 "uuid": "c2d087b6-aa66-4317-b035-b228fa9b0e5b", 00:17:10.217 "assigned_rate_limits": { 00:17:10.217 "rw_ios_per_sec": 0, 00:17:10.217 "rw_mbytes_per_sec": 0, 00:17:10.217 "r_mbytes_per_sec": 0, 00:17:10.217 "w_mbytes_per_sec": 0 00:17:10.217 }, 00:17:10.217 "claimed": true, 00:17:10.217 "claim_type": "exclusive_write", 00:17:10.217 "zoned": false, 00:17:10.217 "supported_io_types": { 00:17:10.217 "read": true, 00:17:10.217 "write": true, 00:17:10.217 "unmap": true, 00:17:10.217 "flush": true, 00:17:10.217 "reset": true, 00:17:10.217 "nvme_admin": false, 00:17:10.217 "nvme_io": false, 00:17:10.217 "nvme_io_md": false, 00:17:10.217 "write_zeroes": true, 00:17:10.217 "zcopy": true, 00:17:10.217 "get_zone_info": false, 00:17:10.217 "zone_management": false, 00:17:10.217 "zone_append": false, 00:17:10.217 "compare": false, 00:17:10.217 "compare_and_write": false, 00:17:10.217 "abort": true, 00:17:10.217 "seek_hole": false, 00:17:10.217 "seek_data": false, 00:17:10.217 "copy": true, 00:17:10.218 "nvme_iov_md": false 00:17:10.218 }, 00:17:10.218 "memory_domains": [ 00:17:10.218 { 00:17:10.218 "dma_device_id": "system", 00:17:10.218 "dma_device_type": 1 00:17:10.218 }, 00:17:10.218 { 00:17:10.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.218 "dma_device_type": 2 00:17:10.218 } 00:17:10.218 ], 00:17:10.218 "driver_specific": {} 00:17:10.218 }' 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:10.218 19:53:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:10.218 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:10.476 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:10.476 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:10.477 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:10.477 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:10.477 19:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:10.736 "name": "BaseBdev3", 00:17:10.736 "aliases": [ 00:17:10.736 "bd9a3f71-ce69-4ad1-9e81-4528d199c696" 00:17:10.736 ], 00:17:10.736 "product_name": "Malloc disk", 00:17:10.736 "block_size": 512, 00:17:10.736 "num_blocks": 65536, 00:17:10.736 "uuid": "bd9a3f71-ce69-4ad1-9e81-4528d199c696", 00:17:10.736 "assigned_rate_limits": { 00:17:10.736 "rw_ios_per_sec": 0, 00:17:10.736 "rw_mbytes_per_sec": 0, 00:17:10.736 "r_mbytes_per_sec": 0, 00:17:10.736 "w_mbytes_per_sec": 0 00:17:10.736 }, 00:17:10.736 "claimed": true, 00:17:10.736 "claim_type": "exclusive_write", 
00:17:10.736 "zoned": false, 00:17:10.736 "supported_io_types": { 00:17:10.736 "read": true, 00:17:10.736 "write": true, 00:17:10.736 "unmap": true, 00:17:10.736 "flush": true, 00:17:10.736 "reset": true, 00:17:10.736 "nvme_admin": false, 00:17:10.736 "nvme_io": false, 00:17:10.736 "nvme_io_md": false, 00:17:10.736 "write_zeroes": true, 00:17:10.736 "zcopy": true, 00:17:10.736 "get_zone_info": false, 00:17:10.736 "zone_management": false, 00:17:10.736 "zone_append": false, 00:17:10.736 "compare": false, 00:17:10.736 "compare_and_write": false, 00:17:10.736 "abort": true, 00:17:10.736 "seek_hole": false, 00:17:10.736 "seek_data": false, 00:17:10.736 "copy": true, 00:17:10.736 "nvme_iov_md": false 00:17:10.736 }, 00:17:10.736 "memory_domains": [ 00:17:10.736 { 00:17:10.736 "dma_device_id": "system", 00:17:10.736 "dma_device_type": 1 00:17:10.736 }, 00:17:10.736 { 00:17:10.736 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.736 "dma_device_type": 2 00:17:10.736 } 00:17:10.736 ], 00:17:10.736 "driver_specific": {} 00:17:10.736 }' 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:10.736 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:10.995 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:10.995 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:10.995 19:53:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:10.995 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:10.995 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:10.995 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:11.255 [2024-07-24 19:53:02.704915] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:11.255 [2024-07-24 19:53:02.704941] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:11.255 [2024-07-24 19:53:02.704990] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:11.255 [2024-07-24 19:53:02.705258] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:11.255 [2024-07-24 19:53:02.705270] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13b7530 name Existed_Raid, state offline 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1421119 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1421119 ']' 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1421119 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1421119 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:11.255 19:53:02 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1421119' 00:17:11.255 killing process with pid 1421119 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1421119 00:17:11.255 [2024-07-24 19:53:02.775724] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:11.255 19:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1421119 00:17:11.255 [2024-07-24 19:53:02.801592] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:11.514 00:17:11.514 real 0m29.256s 00:17:11.514 user 0m53.712s 00:17:11.514 sys 0m5.215s 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:11.514 ************************************ 00:17:11.514 END TEST raid_state_function_test 00:17:11.514 ************************************ 00:17:11.514 19:53:03 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:17:11.514 19:53:03 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:11.514 19:53:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:11.514 19:53:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:11.514 ************************************ 00:17:11.514 START TEST raid_state_function_test_sb 00:17:11.514 ************************************ 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 true 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1425577 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1425577' 00:17:11.514 Process raid pid: 1425577 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1425577 /var/tmp/spdk-raid.sock 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1425577 ']' 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:17:11.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:11.514 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.773 [2024-07-24 19:53:03.206106] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:17:11.773 [2024-07-24 19:53:03.206239] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:12.032 [2024-07-24 19:53:03.403763] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:12.032 [2024-07-24 19:53:03.506985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:12.032 [2024-07-24 19:53:03.569021] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:12.032 [2024-07-24 19:53:03.569052] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:12.292 [2024-07-24 19:53:03.862079] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:12.292 [2024-07-24 19:53:03.862115] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:12.292 [2024-07-24 19:53:03.862126] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:17:12.292 [2024-07-24 19:53:03.862138] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:12.292 [2024-07-24 19:53:03.862147] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:12.292 [2024-07-24 19:53:03.862158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.292 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.551 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.551 19:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.551 19:53:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.551 "name": "Existed_Raid", 00:17:12.551 "uuid": "22cfda9f-cf5e-4a43-a022-7b19b739724d", 00:17:12.551 "strip_size_kb": 0, 00:17:12.551 "state": "configuring", 00:17:12.551 "raid_level": "raid1", 00:17:12.551 "superblock": true, 00:17:12.551 "num_base_bdevs": 3, 00:17:12.551 "num_base_bdevs_discovered": 0, 00:17:12.551 "num_base_bdevs_operational": 3, 00:17:12.551 "base_bdevs_list": [ 00:17:12.551 { 00:17:12.551 "name": "BaseBdev1", 00:17:12.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.551 "is_configured": false, 00:17:12.551 "data_offset": 0, 00:17:12.551 "data_size": 0 00:17:12.551 }, 00:17:12.551 { 00:17:12.551 "name": "BaseBdev2", 00:17:12.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.551 "is_configured": false, 00:17:12.551 "data_offset": 0, 00:17:12.551 "data_size": 0 00:17:12.551 }, 00:17:12.551 { 00:17:12.551 "name": "BaseBdev3", 00:17:12.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.551 "is_configured": false, 00:17:12.551 "data_offset": 0, 00:17:12.551 "data_size": 0 00:17:12.551 } 00:17:12.551 ] 00:17:12.551 }' 00:17:12.551 19:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.551 19:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.120 19:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:13.688 [2024-07-24 19:53:05.169374] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:13.688 [2024-07-24 19:53:05.169409] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20a8a10 name Existed_Raid, state configuring 00:17:13.688 19:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:14.257 [2024-07-24 19:53:05.682739] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:14.257 [2024-07-24 19:53:05.682771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:14.257 [2024-07-24 19:53:05.682781] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:14.257 [2024-07-24 19:53:05.682793] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:14.257 [2024-07-24 19:53:05.682802] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:14.257 [2024-07-24 19:53:05.682813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:14.257 19:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:14.826 [2024-07-24 19:53:06.207234] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:14.826 BaseBdev1 00:17:14.826 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:14.826 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:14.826 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:14.826 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:14.826 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:14.826 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:17:14.826 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:15.085 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:15.654 [ 00:17:15.654 { 00:17:15.654 "name": "BaseBdev1", 00:17:15.654 "aliases": [ 00:17:15.654 "c682316d-f3e3-4570-b128-31883d890f57" 00:17:15.654 ], 00:17:15.654 "product_name": "Malloc disk", 00:17:15.654 "block_size": 512, 00:17:15.654 "num_blocks": 65536, 00:17:15.654 "uuid": "c682316d-f3e3-4570-b128-31883d890f57", 00:17:15.654 "assigned_rate_limits": { 00:17:15.654 "rw_ios_per_sec": 0, 00:17:15.654 "rw_mbytes_per_sec": 0, 00:17:15.654 "r_mbytes_per_sec": 0, 00:17:15.654 "w_mbytes_per_sec": 0 00:17:15.654 }, 00:17:15.654 "claimed": true, 00:17:15.654 "claim_type": "exclusive_write", 00:17:15.654 "zoned": false, 00:17:15.654 "supported_io_types": { 00:17:15.654 "read": true, 00:17:15.654 "write": true, 00:17:15.654 "unmap": true, 00:17:15.654 "flush": true, 00:17:15.654 "reset": true, 00:17:15.654 "nvme_admin": false, 00:17:15.654 "nvme_io": false, 00:17:15.654 "nvme_io_md": false, 00:17:15.654 "write_zeroes": true, 00:17:15.654 "zcopy": true, 00:17:15.654 "get_zone_info": false, 00:17:15.654 "zone_management": false, 00:17:15.654 "zone_append": false, 00:17:15.654 "compare": false, 00:17:15.654 "compare_and_write": false, 00:17:15.655 "abort": true, 00:17:15.655 "seek_hole": false, 00:17:15.655 "seek_data": false, 00:17:15.655 "copy": true, 00:17:15.655 "nvme_iov_md": false 00:17:15.655 }, 00:17:15.655 "memory_domains": [ 00:17:15.655 { 00:17:15.655 "dma_device_id": "system", 00:17:15.655 "dma_device_type": 1 00:17:15.655 }, 00:17:15.655 { 00:17:15.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.655 
"dma_device_type": 2 00:17:15.655 } 00:17:15.655 ], 00:17:15.655 "driver_specific": {} 00:17:15.655 } 00:17:15.655 ] 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.655 19:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.655 19:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.655 19:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.655 19:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.655 "name": "Existed_Raid", 00:17:15.655 "uuid": "50cfb807-2410-4669-87b9-82d0319e5f06", 00:17:15.655 "strip_size_kb": 0, 
00:17:15.655 "state": "configuring", 00:17:15.655 "raid_level": "raid1", 00:17:15.655 "superblock": true, 00:17:15.655 "num_base_bdevs": 3, 00:17:15.655 "num_base_bdevs_discovered": 1, 00:17:15.655 "num_base_bdevs_operational": 3, 00:17:15.655 "base_bdevs_list": [ 00:17:15.655 { 00:17:15.655 "name": "BaseBdev1", 00:17:15.655 "uuid": "c682316d-f3e3-4570-b128-31883d890f57", 00:17:15.655 "is_configured": true, 00:17:15.655 "data_offset": 2048, 00:17:15.655 "data_size": 63488 00:17:15.655 }, 00:17:15.655 { 00:17:15.655 "name": "BaseBdev2", 00:17:15.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.655 "is_configured": false, 00:17:15.655 "data_offset": 0, 00:17:15.655 "data_size": 0 00:17:15.655 }, 00:17:15.655 { 00:17:15.655 "name": "BaseBdev3", 00:17:15.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.655 "is_configured": false, 00:17:15.655 "data_offset": 0, 00:17:15.655 "data_size": 0 00:17:15.655 } 00:17:15.655 ] 00:17:15.655 }' 00:17:15.655 19:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.655 19:53:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:16.280 19:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:16.540 [2024-07-24 19:53:08.060137] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:16.540 [2024-07-24 19:53:08.060175] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20a82e0 name Existed_Raid, state configuring 00:17:16.540 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:16.799 [2024-07-24 19:53:08.308844] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:16.799 [2024-07-24 19:53:08.310375] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:16.799 [2024-07-24 19:53:08.310416] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:16.799 [2024-07-24 19:53:08.310426] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:16.799 [2024-07-24 19:53:08.310438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.799 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.059 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.059 "name": "Existed_Raid", 00:17:17.059 "uuid": "b3125246-c41d-490d-af8a-386836c9fb8e", 00:17:17.059 "strip_size_kb": 0, 00:17:17.059 "state": "configuring", 00:17:17.059 "raid_level": "raid1", 00:17:17.059 "superblock": true, 00:17:17.059 "num_base_bdevs": 3, 00:17:17.059 "num_base_bdevs_discovered": 1, 00:17:17.059 "num_base_bdevs_operational": 3, 00:17:17.059 "base_bdevs_list": [ 00:17:17.059 { 00:17:17.059 "name": "BaseBdev1", 00:17:17.059 "uuid": "c682316d-f3e3-4570-b128-31883d890f57", 00:17:17.059 "is_configured": true, 00:17:17.059 "data_offset": 2048, 00:17:17.059 "data_size": 63488 00:17:17.059 }, 00:17:17.059 { 00:17:17.059 "name": "BaseBdev2", 00:17:17.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.059 "is_configured": false, 00:17:17.059 "data_offset": 0, 00:17:17.059 "data_size": 0 00:17:17.059 }, 00:17:17.059 { 00:17:17.059 "name": "BaseBdev3", 00:17:17.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.059 "is_configured": false, 00:17:17.059 "data_offset": 0, 00:17:17.059 "data_size": 0 00:17:17.059 } 00:17:17.059 ] 00:17:17.059 }' 00:17:17.059 19:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.059 19:53:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:17.627 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:17.885 
[2024-07-24 19:53:09.427203] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:17.885 BaseBdev2 00:17:17.885 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:17.885 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:17.885 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:17.885 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:17.885 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:17.885 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:17.885 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:18.145 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:18.404 [ 00:17:18.404 { 00:17:18.404 "name": "BaseBdev2", 00:17:18.404 "aliases": [ 00:17:18.404 "40a3a13f-4b41-48fa-862e-965811577cb6" 00:17:18.404 ], 00:17:18.404 "product_name": "Malloc disk", 00:17:18.404 "block_size": 512, 00:17:18.404 "num_blocks": 65536, 00:17:18.404 "uuid": "40a3a13f-4b41-48fa-862e-965811577cb6", 00:17:18.404 "assigned_rate_limits": { 00:17:18.404 "rw_ios_per_sec": 0, 00:17:18.404 "rw_mbytes_per_sec": 0, 00:17:18.404 "r_mbytes_per_sec": 0, 00:17:18.404 "w_mbytes_per_sec": 0 00:17:18.404 }, 00:17:18.404 "claimed": true, 00:17:18.404 "claim_type": "exclusive_write", 00:17:18.404 "zoned": false, 00:17:18.404 "supported_io_types": { 00:17:18.404 "read": true, 00:17:18.404 "write": true, 00:17:18.404 "unmap": 
true, 00:17:18.404 "flush": true, 00:17:18.404 "reset": true, 00:17:18.404 "nvme_admin": false, 00:17:18.404 "nvme_io": false, 00:17:18.404 "nvme_io_md": false, 00:17:18.404 "write_zeroes": true, 00:17:18.404 "zcopy": true, 00:17:18.404 "get_zone_info": false, 00:17:18.404 "zone_management": false, 00:17:18.404 "zone_append": false, 00:17:18.404 "compare": false, 00:17:18.404 "compare_and_write": false, 00:17:18.404 "abort": true, 00:17:18.404 "seek_hole": false, 00:17:18.404 "seek_data": false, 00:17:18.404 "copy": true, 00:17:18.404 "nvme_iov_md": false 00:17:18.404 }, 00:17:18.404 "memory_domains": [ 00:17:18.404 { 00:17:18.404 "dma_device_id": "system", 00:17:18.404 "dma_device_type": 1 00:17:18.404 }, 00:17:18.404 { 00:17:18.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.404 "dma_device_type": 2 00:17:18.404 } 00:17:18.404 ], 00:17:18.404 "driver_specific": {} 00:17:18.404 } 00:17:18.404 ] 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:18.404 
19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.404 19:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.664 19:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.664 "name": "Existed_Raid", 00:17:18.664 "uuid": "b3125246-c41d-490d-af8a-386836c9fb8e", 00:17:18.664 "strip_size_kb": 0, 00:17:18.664 "state": "configuring", 00:17:18.664 "raid_level": "raid1", 00:17:18.664 "superblock": true, 00:17:18.664 "num_base_bdevs": 3, 00:17:18.664 "num_base_bdevs_discovered": 2, 00:17:18.664 "num_base_bdevs_operational": 3, 00:17:18.664 "base_bdevs_list": [ 00:17:18.664 { 00:17:18.664 "name": "BaseBdev1", 00:17:18.664 "uuid": "c682316d-f3e3-4570-b128-31883d890f57", 00:17:18.664 "is_configured": true, 00:17:18.664 "data_offset": 2048, 00:17:18.664 "data_size": 63488 00:17:18.664 }, 00:17:18.664 { 00:17:18.664 "name": "BaseBdev2", 00:17:18.664 "uuid": "40a3a13f-4b41-48fa-862e-965811577cb6", 00:17:18.664 "is_configured": true, 00:17:18.664 "data_offset": 2048, 00:17:18.664 "data_size": 63488 00:17:18.664 }, 00:17:18.665 { 00:17:18.665 "name": "BaseBdev3", 00:17:18.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.665 "is_configured": false, 00:17:18.665 "data_offset": 0, 00:17:18.665 "data_size": 0 00:17:18.665 } 00:17:18.665 ] 00:17:18.665 }' 00:17:18.665 
19:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.665 19:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:19.233 19:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:19.492 [2024-07-24 19:53:11.026926] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:19.492 [2024-07-24 19:53:11.027092] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20a91d0 00:17:19.492 [2024-07-24 19:53:11.027105] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:19.492 [2024-07-24 19:53:11.027280] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20a8ea0 00:17:19.492 [2024-07-24 19:53:11.027424] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20a91d0 00:17:19.492 [2024-07-24 19:53:11.027435] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20a91d0 00:17:19.492 [2024-07-24 19:53:11.027527] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:19.492 BaseBdev3 00:17:19.492 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:19.492 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:19.492 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:19.492 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:19.492 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:19.492 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:17:19.492 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.751 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:20.010 [ 00:17:20.010 { 00:17:20.010 "name": "BaseBdev3", 00:17:20.010 "aliases": [ 00:17:20.010 "3f8114d6-005d-4177-bd5a-ca767b62101c" 00:17:20.010 ], 00:17:20.010 "product_name": "Malloc disk", 00:17:20.010 "block_size": 512, 00:17:20.010 "num_blocks": 65536, 00:17:20.010 "uuid": "3f8114d6-005d-4177-bd5a-ca767b62101c", 00:17:20.010 "assigned_rate_limits": { 00:17:20.010 "rw_ios_per_sec": 0, 00:17:20.010 "rw_mbytes_per_sec": 0, 00:17:20.010 "r_mbytes_per_sec": 0, 00:17:20.010 "w_mbytes_per_sec": 0 00:17:20.010 }, 00:17:20.010 "claimed": true, 00:17:20.010 "claim_type": "exclusive_write", 00:17:20.010 "zoned": false, 00:17:20.010 "supported_io_types": { 00:17:20.010 "read": true, 00:17:20.010 "write": true, 00:17:20.010 "unmap": true, 00:17:20.010 "flush": true, 00:17:20.010 "reset": true, 00:17:20.010 "nvme_admin": false, 00:17:20.010 "nvme_io": false, 00:17:20.010 "nvme_io_md": false, 00:17:20.010 "write_zeroes": true, 00:17:20.010 "zcopy": true, 00:17:20.010 "get_zone_info": false, 00:17:20.010 "zone_management": false, 00:17:20.010 "zone_append": false, 00:17:20.010 "compare": false, 00:17:20.010 "compare_and_write": false, 00:17:20.010 "abort": true, 00:17:20.010 "seek_hole": false, 00:17:20.010 "seek_data": false, 00:17:20.010 "copy": true, 00:17:20.010 "nvme_iov_md": false 00:17:20.010 }, 00:17:20.010 "memory_domains": [ 00:17:20.010 { 00:17:20.010 "dma_device_id": "system", 00:17:20.010 "dma_device_type": 1 00:17:20.010 }, 00:17:20.010 { 00:17:20.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.010 
"dma_device_type": 2 00:17:20.010 } 00:17:20.010 ], 00:17:20.010 "driver_specific": {} 00:17:20.010 } 00:17:20.010 ] 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.010 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.011 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.011 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.269 19:53:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.269 "name": "Existed_Raid", 00:17:20.269 "uuid": "b3125246-c41d-490d-af8a-386836c9fb8e", 00:17:20.269 "strip_size_kb": 0, 00:17:20.269 "state": "online", 00:17:20.269 "raid_level": "raid1", 00:17:20.269 "superblock": true, 00:17:20.269 "num_base_bdevs": 3, 00:17:20.269 "num_base_bdevs_discovered": 3, 00:17:20.269 "num_base_bdevs_operational": 3, 00:17:20.269 "base_bdevs_list": [ 00:17:20.269 { 00:17:20.269 "name": "BaseBdev1", 00:17:20.269 "uuid": "c682316d-f3e3-4570-b128-31883d890f57", 00:17:20.269 "is_configured": true, 00:17:20.269 "data_offset": 2048, 00:17:20.269 "data_size": 63488 00:17:20.269 }, 00:17:20.269 { 00:17:20.269 "name": "BaseBdev2", 00:17:20.269 "uuid": "40a3a13f-4b41-48fa-862e-965811577cb6", 00:17:20.269 "is_configured": true, 00:17:20.269 "data_offset": 2048, 00:17:20.269 "data_size": 63488 00:17:20.269 }, 00:17:20.269 { 00:17:20.269 "name": "BaseBdev3", 00:17:20.269 "uuid": "3f8114d6-005d-4177-bd5a-ca767b62101c", 00:17:20.269 "is_configured": true, 00:17:20.269 "data_offset": 2048, 00:17:20.269 "data_size": 63488 00:17:20.269 } 00:17:20.269 ] 00:17:20.269 }' 00:17:20.269 19:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.269 19:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:20.836 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:21.095 [2024-07-24 19:53:12.623539] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:21.095 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:21.095 "name": "Existed_Raid", 00:17:21.095 "aliases": [ 00:17:21.095 "b3125246-c41d-490d-af8a-386836c9fb8e" 00:17:21.095 ], 00:17:21.095 "product_name": "Raid Volume", 00:17:21.095 "block_size": 512, 00:17:21.095 "num_blocks": 63488, 00:17:21.095 "uuid": "b3125246-c41d-490d-af8a-386836c9fb8e", 00:17:21.095 "assigned_rate_limits": { 00:17:21.095 "rw_ios_per_sec": 0, 00:17:21.095 "rw_mbytes_per_sec": 0, 00:17:21.095 "r_mbytes_per_sec": 0, 00:17:21.095 "w_mbytes_per_sec": 0 00:17:21.095 }, 00:17:21.095 "claimed": false, 00:17:21.095 "zoned": false, 00:17:21.095 "supported_io_types": { 00:17:21.095 "read": true, 00:17:21.095 "write": true, 00:17:21.095 "unmap": false, 00:17:21.095 "flush": false, 00:17:21.095 "reset": true, 00:17:21.095 "nvme_admin": false, 00:17:21.095 "nvme_io": false, 00:17:21.095 "nvme_io_md": false, 00:17:21.095 "write_zeroes": true, 00:17:21.095 "zcopy": false, 00:17:21.095 "get_zone_info": false, 00:17:21.095 "zone_management": false, 00:17:21.095 "zone_append": false, 00:17:21.095 "compare": false, 00:17:21.095 "compare_and_write": false, 00:17:21.095 "abort": false, 00:17:21.095 "seek_hole": false, 00:17:21.095 "seek_data": false, 00:17:21.095 "copy": false, 00:17:21.095 "nvme_iov_md": false 00:17:21.095 }, 00:17:21.095 "memory_domains": [ 00:17:21.095 { 00:17:21.095 "dma_device_id": "system", 00:17:21.095 
"dma_device_type": 1 00:17:21.095 }, 00:17:21.095 { 00:17:21.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.095 "dma_device_type": 2 00:17:21.095 }, 00:17:21.095 { 00:17:21.095 "dma_device_id": "system", 00:17:21.095 "dma_device_type": 1 00:17:21.095 }, 00:17:21.095 { 00:17:21.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.095 "dma_device_type": 2 00:17:21.095 }, 00:17:21.095 { 00:17:21.095 "dma_device_id": "system", 00:17:21.095 "dma_device_type": 1 00:17:21.095 }, 00:17:21.095 { 00:17:21.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.095 "dma_device_type": 2 00:17:21.095 } 00:17:21.095 ], 00:17:21.095 "driver_specific": { 00:17:21.095 "raid": { 00:17:21.095 "uuid": "b3125246-c41d-490d-af8a-386836c9fb8e", 00:17:21.095 "strip_size_kb": 0, 00:17:21.095 "state": "online", 00:17:21.095 "raid_level": "raid1", 00:17:21.095 "superblock": true, 00:17:21.095 "num_base_bdevs": 3, 00:17:21.095 "num_base_bdevs_discovered": 3, 00:17:21.095 "num_base_bdevs_operational": 3, 00:17:21.095 "base_bdevs_list": [ 00:17:21.096 { 00:17:21.096 "name": "BaseBdev1", 00:17:21.096 "uuid": "c682316d-f3e3-4570-b128-31883d890f57", 00:17:21.096 "is_configured": true, 00:17:21.096 "data_offset": 2048, 00:17:21.096 "data_size": 63488 00:17:21.096 }, 00:17:21.096 { 00:17:21.096 "name": "BaseBdev2", 00:17:21.096 "uuid": "40a3a13f-4b41-48fa-862e-965811577cb6", 00:17:21.096 "is_configured": true, 00:17:21.096 "data_offset": 2048, 00:17:21.096 "data_size": 63488 00:17:21.096 }, 00:17:21.096 { 00:17:21.096 "name": "BaseBdev3", 00:17:21.096 "uuid": "3f8114d6-005d-4177-bd5a-ca767b62101c", 00:17:21.096 "is_configured": true, 00:17:21.096 "data_offset": 2048, 00:17:21.096 "data_size": 63488 00:17:21.096 } 00:17:21.096 ] 00:17:21.096 } 00:17:21.096 } 00:17:21.096 }' 00:17:21.096 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:21.354 19:53:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:21.354 BaseBdev2 00:17:21.354 BaseBdev3' 00:17:21.354 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.354 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:21.354 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:21.612 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.612 "name": "BaseBdev1", 00:17:21.612 "aliases": [ 00:17:21.612 "c682316d-f3e3-4570-b128-31883d890f57" 00:17:21.612 ], 00:17:21.612 "product_name": "Malloc disk", 00:17:21.612 "block_size": 512, 00:17:21.612 "num_blocks": 65536, 00:17:21.612 "uuid": "c682316d-f3e3-4570-b128-31883d890f57", 00:17:21.612 "assigned_rate_limits": { 00:17:21.612 "rw_ios_per_sec": 0, 00:17:21.612 "rw_mbytes_per_sec": 0, 00:17:21.612 "r_mbytes_per_sec": 0, 00:17:21.612 "w_mbytes_per_sec": 0 00:17:21.612 }, 00:17:21.612 "claimed": true, 00:17:21.612 "claim_type": "exclusive_write", 00:17:21.612 "zoned": false, 00:17:21.612 "supported_io_types": { 00:17:21.612 "read": true, 00:17:21.612 "write": true, 00:17:21.612 "unmap": true, 00:17:21.612 "flush": true, 00:17:21.612 "reset": true, 00:17:21.612 "nvme_admin": false, 00:17:21.612 "nvme_io": false, 00:17:21.612 "nvme_io_md": false, 00:17:21.612 "write_zeroes": true, 00:17:21.612 "zcopy": true, 00:17:21.612 "get_zone_info": false, 00:17:21.612 "zone_management": false, 00:17:21.612 "zone_append": false, 00:17:21.612 "compare": false, 00:17:21.612 "compare_and_write": false, 00:17:21.612 "abort": true, 00:17:21.612 "seek_hole": false, 00:17:21.612 "seek_data": false, 00:17:21.612 "copy": true, 00:17:21.612 "nvme_iov_md": false 00:17:21.612 }, 00:17:21.612 "memory_domains": 
[ 00:17:21.612 { 00:17:21.612 "dma_device_id": "system", 00:17:21.612 "dma_device_type": 1 00:17:21.612 }, 00:17:21.612 { 00:17:21.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.612 "dma_device_type": 2 00:17:21.612 } 00:17:21.612 ], 00:17:21.612 "driver_specific": {} 00:17:21.612 }' 00:17:21.612 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.612 19:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.612 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.612 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.612 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.612 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.612 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.612 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.871 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:21.871 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.871 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.871 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:21.871 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.871 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:21.871 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:17:22.129 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:22.129 "name": "BaseBdev2", 00:17:22.129 "aliases": [ 00:17:22.129 "40a3a13f-4b41-48fa-862e-965811577cb6" 00:17:22.129 ], 00:17:22.129 "product_name": "Malloc disk", 00:17:22.129 "block_size": 512, 00:17:22.129 "num_blocks": 65536, 00:17:22.129 "uuid": "40a3a13f-4b41-48fa-862e-965811577cb6", 00:17:22.129 "assigned_rate_limits": { 00:17:22.129 "rw_ios_per_sec": 0, 00:17:22.129 "rw_mbytes_per_sec": 0, 00:17:22.129 "r_mbytes_per_sec": 0, 00:17:22.129 "w_mbytes_per_sec": 0 00:17:22.129 }, 00:17:22.129 "claimed": true, 00:17:22.129 "claim_type": "exclusive_write", 00:17:22.129 "zoned": false, 00:17:22.129 "supported_io_types": { 00:17:22.129 "read": true, 00:17:22.129 "write": true, 00:17:22.129 "unmap": true, 00:17:22.129 "flush": true, 00:17:22.129 "reset": true, 00:17:22.129 "nvme_admin": false, 00:17:22.129 "nvme_io": false, 00:17:22.129 "nvme_io_md": false, 00:17:22.129 "write_zeroes": true, 00:17:22.129 "zcopy": true, 00:17:22.129 "get_zone_info": false, 00:17:22.129 "zone_management": false, 00:17:22.129 "zone_append": false, 00:17:22.129 "compare": false, 00:17:22.129 "compare_and_write": false, 00:17:22.129 "abort": true, 00:17:22.129 "seek_hole": false, 00:17:22.129 "seek_data": false, 00:17:22.129 "copy": true, 00:17:22.129 "nvme_iov_md": false 00:17:22.129 }, 00:17:22.129 "memory_domains": [ 00:17:22.129 { 00:17:22.129 "dma_device_id": "system", 00:17:22.129 "dma_device_type": 1 00:17:22.129 }, 00:17:22.129 { 00:17:22.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.129 "dma_device_type": 2 00:17:22.129 } 00:17:22.129 ], 00:17:22.129 "driver_specific": {} 00:17:22.129 }' 00:17:22.129 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.129 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.129 19:53:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:22.129 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.129 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:22.388 19:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:22.647 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:22.647 "name": "BaseBdev3", 00:17:22.647 "aliases": [ 00:17:22.647 "3f8114d6-005d-4177-bd5a-ca767b62101c" 00:17:22.647 ], 00:17:22.647 "product_name": "Malloc disk", 00:17:22.647 "block_size": 512, 00:17:22.647 "num_blocks": 65536, 00:17:22.647 "uuid": "3f8114d6-005d-4177-bd5a-ca767b62101c", 00:17:22.647 "assigned_rate_limits": { 00:17:22.647 "rw_ios_per_sec": 0, 00:17:22.647 "rw_mbytes_per_sec": 0, 00:17:22.647 "r_mbytes_per_sec": 0, 00:17:22.647 
"w_mbytes_per_sec": 0 00:17:22.647 }, 00:17:22.647 "claimed": true, 00:17:22.647 "claim_type": "exclusive_write", 00:17:22.647 "zoned": false, 00:17:22.647 "supported_io_types": { 00:17:22.647 "read": true, 00:17:22.647 "write": true, 00:17:22.647 "unmap": true, 00:17:22.647 "flush": true, 00:17:22.647 "reset": true, 00:17:22.647 "nvme_admin": false, 00:17:22.647 "nvme_io": false, 00:17:22.647 "nvme_io_md": false, 00:17:22.647 "write_zeroes": true, 00:17:22.647 "zcopy": true, 00:17:22.647 "get_zone_info": false, 00:17:22.647 "zone_management": false, 00:17:22.647 "zone_append": false, 00:17:22.647 "compare": false, 00:17:22.647 "compare_and_write": false, 00:17:22.647 "abort": true, 00:17:22.647 "seek_hole": false, 00:17:22.647 "seek_data": false, 00:17:22.647 "copy": true, 00:17:22.647 "nvme_iov_md": false 00:17:22.647 }, 00:17:22.647 "memory_domains": [ 00:17:22.647 { 00:17:22.647 "dma_device_id": "system", 00:17:22.647 "dma_device_type": 1 00:17:22.647 }, 00:17:22.647 { 00:17:22.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.647 "dma_device_type": 2 00:17:22.647 } 00:17:22.647 ], 00:17:22.647 "driver_specific": {} 00:17:22.647 }' 00:17:22.647 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.647 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.647 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:22.647 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.905 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.906 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:23.164 [2024-07-24 19:53:14.716835] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.164 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.423 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.423 "name": "Existed_Raid", 00:17:23.423 "uuid": "b3125246-c41d-490d-af8a-386836c9fb8e", 00:17:23.423 "strip_size_kb": 0, 00:17:23.423 "state": "online", 00:17:23.423 "raid_level": "raid1", 00:17:23.423 "superblock": true, 00:17:23.423 "num_base_bdevs": 3, 00:17:23.423 "num_base_bdevs_discovered": 2, 00:17:23.423 "num_base_bdevs_operational": 2, 00:17:23.423 "base_bdevs_list": [ 00:17:23.423 { 00:17:23.423 "name": null, 00:17:23.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.423 "is_configured": false, 00:17:23.423 "data_offset": 2048, 00:17:23.423 "data_size": 63488 00:17:23.423 }, 00:17:23.423 { 00:17:23.423 "name": "BaseBdev2", 00:17:23.423 "uuid": "40a3a13f-4b41-48fa-862e-965811577cb6", 00:17:23.423 "is_configured": true, 00:17:23.423 "data_offset": 2048, 00:17:23.423 "data_size": 63488 00:17:23.423 }, 00:17:23.423 { 00:17:23.423 "name": "BaseBdev3", 00:17:23.423 "uuid": "3f8114d6-005d-4177-bd5a-ca767b62101c", 00:17:23.423 "is_configured": true, 00:17:23.423 "data_offset": 2048, 00:17:23.423 "data_size": 63488 00:17:23.423 } 
00:17:23.423 ] 00:17:23.423 }' 00:17:23.423 19:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.423 19:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.359 19:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:24.359 19:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:24.359 19:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.359 19:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:24.359 19:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:24.359 19:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:24.359 19:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:24.618 [2024-07-24 19:53:16.070362] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:24.618 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:24.618 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:24.618 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.618 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:24.876 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:24.876 19:53:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:24.876 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:25.135 [2024-07-24 19:53:16.575988] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:25.135 [2024-07-24 19:53:16.576071] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:25.135 [2024-07-24 19:53:16.586831] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:25.135 [2024-07-24 19:53:16.586864] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:25.135 [2024-07-24 19:53:16.586875] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20a91d0 name Existed_Raid, state offline 00:17:25.135 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:25.135 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:25.135 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.135 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:25.393 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:25.393 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:25.393 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:25.393 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:25.393 19:53:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:25.394 19:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:25.652 BaseBdev2 00:17:25.652 19:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:25.652 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:25.652 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:25.652 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:25.652 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:25.652 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:25.652 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:25.917 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:26.182 [ 00:17:26.182 { 00:17:26.182 "name": "BaseBdev2", 00:17:26.182 "aliases": [ 00:17:26.182 "f8372fd6-c8d7-4971-9801-29f1a6974352" 00:17:26.182 ], 00:17:26.182 "product_name": "Malloc disk", 00:17:26.182 "block_size": 512, 00:17:26.182 "num_blocks": 65536, 00:17:26.182 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:26.182 "assigned_rate_limits": { 00:17:26.182 "rw_ios_per_sec": 0, 00:17:26.182 "rw_mbytes_per_sec": 0, 00:17:26.182 "r_mbytes_per_sec": 0, 00:17:26.182 "w_mbytes_per_sec": 0 00:17:26.182 }, 00:17:26.182 "claimed": false, 00:17:26.182 "zoned": false, 
00:17:26.182 "supported_io_types": { 00:17:26.182 "read": true, 00:17:26.182 "write": true, 00:17:26.182 "unmap": true, 00:17:26.182 "flush": true, 00:17:26.182 "reset": true, 00:17:26.182 "nvme_admin": false, 00:17:26.182 "nvme_io": false, 00:17:26.182 "nvme_io_md": false, 00:17:26.182 "write_zeroes": true, 00:17:26.182 "zcopy": true, 00:17:26.182 "get_zone_info": false, 00:17:26.182 "zone_management": false, 00:17:26.182 "zone_append": false, 00:17:26.182 "compare": false, 00:17:26.182 "compare_and_write": false, 00:17:26.182 "abort": true, 00:17:26.182 "seek_hole": false, 00:17:26.182 "seek_data": false, 00:17:26.182 "copy": true, 00:17:26.182 "nvme_iov_md": false 00:17:26.182 }, 00:17:26.182 "memory_domains": [ 00:17:26.182 { 00:17:26.182 "dma_device_id": "system", 00:17:26.182 "dma_device_type": 1 00:17:26.182 }, 00:17:26.182 { 00:17:26.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.182 "dma_device_type": 2 00:17:26.182 } 00:17:26.182 ], 00:17:26.182 "driver_specific": {} 00:17:26.182 } 00:17:26.182 ] 00:17:26.182 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:26.182 19:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:26.182 19:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:26.182 19:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:26.441 BaseBdev3 00:17:26.441 19:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:26.441 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:26.441 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:26.441 19:53:17 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:26.441 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:26.441 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:26.441 19:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:26.700 19:53:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:26.959 [ 00:17:26.959 { 00:17:26.959 "name": "BaseBdev3", 00:17:26.959 "aliases": [ 00:17:26.959 "862fc599-e694-4e48-bac5-f568e4847411" 00:17:26.959 ], 00:17:26.959 "product_name": "Malloc disk", 00:17:26.959 "block_size": 512, 00:17:26.959 "num_blocks": 65536, 00:17:26.959 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:26.959 "assigned_rate_limits": { 00:17:26.959 "rw_ios_per_sec": 0, 00:17:26.959 "rw_mbytes_per_sec": 0, 00:17:26.960 "r_mbytes_per_sec": 0, 00:17:26.960 "w_mbytes_per_sec": 0 00:17:26.960 }, 00:17:26.960 "claimed": false, 00:17:26.960 "zoned": false, 00:17:26.960 "supported_io_types": { 00:17:26.960 "read": true, 00:17:26.960 "write": true, 00:17:26.960 "unmap": true, 00:17:26.960 "flush": true, 00:17:26.960 "reset": true, 00:17:26.960 "nvme_admin": false, 00:17:26.960 "nvme_io": false, 00:17:26.960 "nvme_io_md": false, 00:17:26.960 "write_zeroes": true, 00:17:26.960 "zcopy": true, 00:17:26.960 "get_zone_info": false, 00:17:26.960 "zone_management": false, 00:17:26.960 "zone_append": false, 00:17:26.960 "compare": false, 00:17:26.960 "compare_and_write": false, 00:17:26.960 "abort": true, 00:17:26.960 "seek_hole": false, 00:17:26.960 "seek_data": false, 00:17:26.960 "copy": true, 00:17:26.960 "nvme_iov_md": 
false 00:17:26.960 }, 00:17:26.960 "memory_domains": [ 00:17:26.960 { 00:17:26.960 "dma_device_id": "system", 00:17:26.960 "dma_device_type": 1 00:17:26.960 }, 00:17:26.960 { 00:17:26.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.960 "dma_device_type": 2 00:17:26.960 } 00:17:26.960 ], 00:17:26.960 "driver_specific": {} 00:17:26.960 } 00:17:26.960 ] 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:26.960 [2024-07-24 19:53:18.521668] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:26.960 [2024-07-24 19:53:18.521707] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:26.960 [2024-07-24 19:53:18.521724] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:26.960 [2024-07-24 19:53:18.523096] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.960 19:53:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.960 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.220 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.220 "name": "Existed_Raid", 00:17:27.220 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:27.220 "strip_size_kb": 0, 00:17:27.220 "state": "configuring", 00:17:27.220 "raid_level": "raid1", 00:17:27.220 "superblock": true, 00:17:27.220 "num_base_bdevs": 3, 00:17:27.220 "num_base_bdevs_discovered": 2, 00:17:27.220 "num_base_bdevs_operational": 3, 00:17:27.220 "base_bdevs_list": [ 00:17:27.220 { 00:17:27.220 "name": "BaseBdev1", 00:17:27.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.220 "is_configured": false, 00:17:27.220 "data_offset": 0, 00:17:27.220 "data_size": 0 00:17:27.220 }, 00:17:27.220 { 00:17:27.220 "name": "BaseBdev2", 00:17:27.220 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:27.220 "is_configured": true, 00:17:27.220 "data_offset": 2048, 00:17:27.220 "data_size": 63488 00:17:27.220 }, 00:17:27.220 { 00:17:27.220 "name": "BaseBdev3", 
00:17:27.220 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:27.220 "is_configured": true, 00:17:27.220 "data_offset": 2048, 00:17:27.220 "data_size": 63488 00:17:27.220 } 00:17:27.220 ] 00:17:27.220 }' 00:17:27.220 19:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.220 19:53:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:28.157 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:28.157 [2024-07-24 19:53:19.644628] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:28.157 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:28.157 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.157 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.157 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.158 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.417 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.417 "name": "Existed_Raid", 00:17:28.417 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:28.417 "strip_size_kb": 0, 00:17:28.417 "state": "configuring", 00:17:28.417 "raid_level": "raid1", 00:17:28.417 "superblock": true, 00:17:28.417 "num_base_bdevs": 3, 00:17:28.417 "num_base_bdevs_discovered": 1, 00:17:28.417 "num_base_bdevs_operational": 3, 00:17:28.417 "base_bdevs_list": [ 00:17:28.417 { 00:17:28.417 "name": "BaseBdev1", 00:17:28.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.417 "is_configured": false, 00:17:28.417 "data_offset": 0, 00:17:28.417 "data_size": 0 00:17:28.417 }, 00:17:28.417 { 00:17:28.417 "name": null, 00:17:28.417 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:28.417 "is_configured": false, 00:17:28.417 "data_offset": 2048, 00:17:28.417 "data_size": 63488 00:17:28.417 }, 00:17:28.417 { 00:17:28.417 "name": "BaseBdev3", 00:17:28.417 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:28.417 "is_configured": true, 00:17:28.417 "data_offset": 2048, 00:17:28.417 "data_size": 63488 00:17:28.417 } 00:17:28.417 ] 00:17:28.417 }' 00:17:28.417 19:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.417 19:53:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:28.986 19:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.986 19:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:17:29.246 19:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:29.246 19:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:29.506 [2024-07-24 19:53:20.948667] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:29.506 BaseBdev1 00:17:29.506 19:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:29.506 19:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:29.506 19:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:29.506 19:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:29.506 19:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:29.506 19:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:29.506 19:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.766 19:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:30.027 [ 00:17:30.027 { 00:17:30.027 "name": "BaseBdev1", 00:17:30.027 "aliases": [ 00:17:30.027 "01c0340f-43c5-423d-ad82-f6b74a25218f" 00:17:30.027 ], 00:17:30.027 "product_name": "Malloc disk", 00:17:30.027 "block_size": 512, 00:17:30.027 "num_blocks": 65536, 00:17:30.027 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:30.027 
"assigned_rate_limits": { 00:17:30.027 "rw_ios_per_sec": 0, 00:17:30.027 "rw_mbytes_per_sec": 0, 00:17:30.027 "r_mbytes_per_sec": 0, 00:17:30.027 "w_mbytes_per_sec": 0 00:17:30.027 }, 00:17:30.027 "claimed": true, 00:17:30.027 "claim_type": "exclusive_write", 00:17:30.027 "zoned": false, 00:17:30.027 "supported_io_types": { 00:17:30.027 "read": true, 00:17:30.027 "write": true, 00:17:30.027 "unmap": true, 00:17:30.027 "flush": true, 00:17:30.027 "reset": true, 00:17:30.027 "nvme_admin": false, 00:17:30.027 "nvme_io": false, 00:17:30.027 "nvme_io_md": false, 00:17:30.027 "write_zeroes": true, 00:17:30.027 "zcopy": true, 00:17:30.027 "get_zone_info": false, 00:17:30.027 "zone_management": false, 00:17:30.027 "zone_append": false, 00:17:30.027 "compare": false, 00:17:30.027 "compare_and_write": false, 00:17:30.027 "abort": true, 00:17:30.027 "seek_hole": false, 00:17:30.027 "seek_data": false, 00:17:30.027 "copy": true, 00:17:30.027 "nvme_iov_md": false 00:17:30.027 }, 00:17:30.027 "memory_domains": [ 00:17:30.027 { 00:17:30.027 "dma_device_id": "system", 00:17:30.027 "dma_device_type": 1 00:17:30.027 }, 00:17:30.027 { 00:17:30.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.027 "dma_device_type": 2 00:17:30.027 } 00:17:30.027 ], 00:17:30.027 "driver_specific": {} 00:17:30.027 } 00:17:30.027 ] 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.027 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.287 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.287 "name": "Existed_Raid", 00:17:30.287 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:30.287 "strip_size_kb": 0, 00:17:30.287 "state": "configuring", 00:17:30.287 "raid_level": "raid1", 00:17:30.287 "superblock": true, 00:17:30.287 "num_base_bdevs": 3, 00:17:30.287 "num_base_bdevs_discovered": 2, 00:17:30.287 "num_base_bdevs_operational": 3, 00:17:30.287 "base_bdevs_list": [ 00:17:30.287 { 00:17:30.287 "name": "BaseBdev1", 00:17:30.287 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:30.287 "is_configured": true, 00:17:30.287 "data_offset": 2048, 00:17:30.287 "data_size": 63488 00:17:30.287 }, 00:17:30.287 { 00:17:30.287 "name": null, 00:17:30.287 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:30.287 "is_configured": false, 00:17:30.287 "data_offset": 2048, 00:17:30.287 "data_size": 63488 00:17:30.287 }, 00:17:30.287 { 00:17:30.287 "name": "BaseBdev3", 00:17:30.287 "uuid": 
"862fc599-e694-4e48-bac5-f568e4847411", 00:17:30.287 "is_configured": true, 00:17:30.287 "data_offset": 2048, 00:17:30.287 "data_size": 63488 00:17:30.287 } 00:17:30.287 ] 00:17:30.287 }' 00:17:30.287 19:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.287 19:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:30.856 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.856 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:31.116 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:31.116 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:31.116 [2024-07-24 19:53:22.705326] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:31.375 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:31.375 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:31.375 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:31.375 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:31.375 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:31.376 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:31.376 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:17:31.376 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.376 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.376 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.376 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.376 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.635 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.635 "name": "Existed_Raid", 00:17:31.635 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:31.635 "strip_size_kb": 0, 00:17:31.635 "state": "configuring", 00:17:31.635 "raid_level": "raid1", 00:17:31.635 "superblock": true, 00:17:31.635 "num_base_bdevs": 3, 00:17:31.635 "num_base_bdevs_discovered": 1, 00:17:31.635 "num_base_bdevs_operational": 3, 00:17:31.635 "base_bdevs_list": [ 00:17:31.635 { 00:17:31.635 "name": "BaseBdev1", 00:17:31.635 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:31.635 "is_configured": true, 00:17:31.635 "data_offset": 2048, 00:17:31.635 "data_size": 63488 00:17:31.635 }, 00:17:31.635 { 00:17:31.635 "name": null, 00:17:31.635 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:31.635 "is_configured": false, 00:17:31.635 "data_offset": 2048, 00:17:31.635 "data_size": 63488 00:17:31.635 }, 00:17:31.635 { 00:17:31.635 "name": null, 00:17:31.635 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:31.635 "is_configured": false, 00:17:31.635 "data_offset": 2048, 00:17:31.635 "data_size": 63488 00:17:31.635 } 00:17:31.635 ] 00:17:31.635 }' 00:17:31.635 19:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:17:31.635 19:53:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:32.205 19:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.205 19:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:32.466 19:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:32.466 19:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:32.466 [2024-07-24 19:53:24.052898] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.784 "name": "Existed_Raid", 00:17:32.784 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:32.784 "strip_size_kb": 0, 00:17:32.784 "state": "configuring", 00:17:32.784 "raid_level": "raid1", 00:17:32.784 "superblock": true, 00:17:32.784 "num_base_bdevs": 3, 00:17:32.784 "num_base_bdevs_discovered": 2, 00:17:32.784 "num_base_bdevs_operational": 3, 00:17:32.784 "base_bdevs_list": [ 00:17:32.784 { 00:17:32.784 "name": "BaseBdev1", 00:17:32.784 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:32.784 "is_configured": true, 00:17:32.784 "data_offset": 2048, 00:17:32.784 "data_size": 63488 00:17:32.784 }, 00:17:32.784 { 00:17:32.784 "name": null, 00:17:32.784 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:32.784 "is_configured": false, 00:17:32.784 "data_offset": 2048, 00:17:32.784 "data_size": 63488 00:17:32.784 }, 00:17:32.784 { 00:17:32.784 "name": "BaseBdev3", 00:17:32.784 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:32.784 "is_configured": true, 00:17:32.784 "data_offset": 2048, 00:17:32.784 "data_size": 63488 00:17:32.784 } 00:17:32.784 ] 00:17:32.784 }' 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.784 19:53:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:33.352 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:17:33.352 19:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.611 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:33.611 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:33.870 [2024-07-24 19:53:25.404511] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:33.870 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:33.870 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.870 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.870 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:33.870 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:33.870 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.870 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.871 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.871 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.871 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.871 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.871 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:34.130 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.130 "name": "Existed_Raid", 00:17:34.130 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:34.130 "strip_size_kb": 0, 00:17:34.130 "state": "configuring", 00:17:34.130 "raid_level": "raid1", 00:17:34.130 "superblock": true, 00:17:34.130 "num_base_bdevs": 3, 00:17:34.130 "num_base_bdevs_discovered": 1, 00:17:34.130 "num_base_bdevs_operational": 3, 00:17:34.130 "base_bdevs_list": [ 00:17:34.130 { 00:17:34.130 "name": null, 00:17:34.130 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:34.130 "is_configured": false, 00:17:34.130 "data_offset": 2048, 00:17:34.130 "data_size": 63488 00:17:34.130 }, 00:17:34.130 { 00:17:34.130 "name": null, 00:17:34.130 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:34.130 "is_configured": false, 00:17:34.130 "data_offset": 2048, 00:17:34.130 "data_size": 63488 00:17:34.130 }, 00:17:34.130 { 00:17:34.130 "name": "BaseBdev3", 00:17:34.130 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:34.130 "is_configured": true, 00:17:34.130 "data_offset": 2048, 00:17:34.130 "data_size": 63488 00:17:34.130 } 00:17:34.130 ] 00:17:34.130 }' 00:17:34.130 19:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.130 19:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:34.699 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.699 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:17:34.957 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:34.957 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:35.216 [2024-07-24 19:53:26.771325] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.216 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.217 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:17:35.475 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.475 "name": "Existed_Raid", 00:17:35.475 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:35.475 "strip_size_kb": 0, 00:17:35.475 "state": "configuring", 00:17:35.475 "raid_level": "raid1", 00:17:35.475 "superblock": true, 00:17:35.475 "num_base_bdevs": 3, 00:17:35.475 "num_base_bdevs_discovered": 2, 00:17:35.475 "num_base_bdevs_operational": 3, 00:17:35.475 "base_bdevs_list": [ 00:17:35.475 { 00:17:35.475 "name": null, 00:17:35.475 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:35.475 "is_configured": false, 00:17:35.475 "data_offset": 2048, 00:17:35.475 "data_size": 63488 00:17:35.475 }, 00:17:35.475 { 00:17:35.475 "name": "BaseBdev2", 00:17:35.475 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:35.475 "is_configured": true, 00:17:35.475 "data_offset": 2048, 00:17:35.475 "data_size": 63488 00:17:35.475 }, 00:17:35.475 { 00:17:35.475 "name": "BaseBdev3", 00:17:35.475 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:35.475 "is_configured": true, 00:17:35.475 "data_offset": 2048, 00:17:35.475 "data_size": 63488 00:17:35.475 } 00:17:35.475 ] 00:17:35.475 }' 00:17:35.475 19:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.475 19:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:36.043 19:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.043 19:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:36.301 19:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:36.301 19:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.301 19:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:36.560 19:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 01c0340f-43c5-423d-ad82-f6b74a25218f 00:17:37.129 [2024-07-24 19:53:28.591511] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:37.129 [2024-07-24 19:53:28.591664] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20aa970 00:17:37.129 [2024-07-24 19:53:28.591683] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:37.129 [2024-07-24 19:53:28.591861] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20a8ea0 00:17:37.129 [2024-07-24 19:53:28.591981] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20aa970 00:17:37.129 [2024-07-24 19:53:28.591992] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20aa970 00:17:37.129 [2024-07-24 19:53:28.592082] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:37.129 NewBaseBdev 00:17:37.129 19:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:37.129 19:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:17:37.129 19:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:37.129 19:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:37.129 19:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:37.129 
19:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:37.130 19:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.389 19:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:37.957 [ 00:17:37.957 { 00:17:37.957 "name": "NewBaseBdev", 00:17:37.957 "aliases": [ 00:17:37.957 "01c0340f-43c5-423d-ad82-f6b74a25218f" 00:17:37.957 ], 00:17:37.957 "product_name": "Malloc disk", 00:17:37.957 "block_size": 512, 00:17:37.957 "num_blocks": 65536, 00:17:37.957 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:37.957 "assigned_rate_limits": { 00:17:37.957 "rw_ios_per_sec": 0, 00:17:37.957 "rw_mbytes_per_sec": 0, 00:17:37.957 "r_mbytes_per_sec": 0, 00:17:37.957 "w_mbytes_per_sec": 0 00:17:37.957 }, 00:17:37.957 "claimed": true, 00:17:37.957 "claim_type": "exclusive_write", 00:17:37.957 "zoned": false, 00:17:37.957 "supported_io_types": { 00:17:37.957 "read": true, 00:17:37.957 "write": true, 00:17:37.957 "unmap": true, 00:17:37.957 "flush": true, 00:17:37.957 "reset": true, 00:17:37.957 "nvme_admin": false, 00:17:37.957 "nvme_io": false, 00:17:37.957 "nvme_io_md": false, 00:17:37.957 "write_zeroes": true, 00:17:37.957 "zcopy": true, 00:17:37.957 "get_zone_info": false, 00:17:37.957 "zone_management": false, 00:17:37.957 "zone_append": false, 00:17:37.957 "compare": false, 00:17:37.957 "compare_and_write": false, 00:17:37.957 "abort": true, 00:17:37.957 "seek_hole": false, 00:17:37.957 "seek_data": false, 00:17:37.957 "copy": true, 00:17:37.957 "nvme_iov_md": false 00:17:37.957 }, 00:17:37.957 "memory_domains": [ 00:17:37.957 { 00:17:37.957 "dma_device_id": "system", 00:17:37.957 "dma_device_type": 1 00:17:37.957 
}, 00:17:37.957 { 00:17:37.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.957 "dma_device_type": 2 00:17:37.957 } 00:17:37.957 ], 00:17:37.957 "driver_specific": {} 00:17:37.957 } 00:17:37.957 ] 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.957 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.526 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.526 "name": "Existed_Raid", 00:17:38.526 "uuid": 
"9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:38.526 "strip_size_kb": 0, 00:17:38.526 "state": "online", 00:17:38.526 "raid_level": "raid1", 00:17:38.526 "superblock": true, 00:17:38.526 "num_base_bdevs": 3, 00:17:38.526 "num_base_bdevs_discovered": 3, 00:17:38.526 "num_base_bdevs_operational": 3, 00:17:38.526 "base_bdevs_list": [ 00:17:38.526 { 00:17:38.526 "name": "NewBaseBdev", 00:17:38.526 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:38.526 "is_configured": true, 00:17:38.526 "data_offset": 2048, 00:17:38.526 "data_size": 63488 00:17:38.526 }, 00:17:38.526 { 00:17:38.526 "name": "BaseBdev2", 00:17:38.526 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:38.526 "is_configured": true, 00:17:38.526 "data_offset": 2048, 00:17:38.526 "data_size": 63488 00:17:38.526 }, 00:17:38.526 { 00:17:38.526 "name": "BaseBdev3", 00:17:38.526 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:38.526 "is_configured": true, 00:17:38.526 "data_offset": 2048, 00:17:38.526 "data_size": 63488 00:17:38.526 } 00:17:38.526 ] 00:17:38.526 }' 00:17:38.526 19:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.526 19:53:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:39.095 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:39.095 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:39.095 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:39.095 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:39.095 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:39.095 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:39.095 19:53:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:39.095 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:39.354 [2024-07-24 19:53:30.789631] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:39.354 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:39.354 "name": "Existed_Raid", 00:17:39.354 "aliases": [ 00:17:39.354 "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f" 00:17:39.354 ], 00:17:39.354 "product_name": "Raid Volume", 00:17:39.354 "block_size": 512, 00:17:39.354 "num_blocks": 63488, 00:17:39.354 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:39.354 "assigned_rate_limits": { 00:17:39.354 "rw_ios_per_sec": 0, 00:17:39.354 "rw_mbytes_per_sec": 0, 00:17:39.354 "r_mbytes_per_sec": 0, 00:17:39.354 "w_mbytes_per_sec": 0 00:17:39.354 }, 00:17:39.354 "claimed": false, 00:17:39.354 "zoned": false, 00:17:39.354 "supported_io_types": { 00:17:39.354 "read": true, 00:17:39.354 "write": true, 00:17:39.354 "unmap": false, 00:17:39.354 "flush": false, 00:17:39.354 "reset": true, 00:17:39.354 "nvme_admin": false, 00:17:39.354 "nvme_io": false, 00:17:39.354 "nvme_io_md": false, 00:17:39.354 "write_zeroes": true, 00:17:39.354 "zcopy": false, 00:17:39.354 "get_zone_info": false, 00:17:39.354 "zone_management": false, 00:17:39.354 "zone_append": false, 00:17:39.354 "compare": false, 00:17:39.354 "compare_and_write": false, 00:17:39.354 "abort": false, 00:17:39.354 "seek_hole": false, 00:17:39.354 "seek_data": false, 00:17:39.354 "copy": false, 00:17:39.354 "nvme_iov_md": false 00:17:39.354 }, 00:17:39.354 "memory_domains": [ 00:17:39.354 { 00:17:39.354 "dma_device_id": "system", 00:17:39.354 "dma_device_type": 1 00:17:39.354 }, 00:17:39.354 { 00:17:39.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.354 
"dma_device_type": 2 00:17:39.354 }, 00:17:39.354 { 00:17:39.354 "dma_device_id": "system", 00:17:39.354 "dma_device_type": 1 00:17:39.354 }, 00:17:39.354 { 00:17:39.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.354 "dma_device_type": 2 00:17:39.354 }, 00:17:39.354 { 00:17:39.354 "dma_device_id": "system", 00:17:39.354 "dma_device_type": 1 00:17:39.354 }, 00:17:39.354 { 00:17:39.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.354 "dma_device_type": 2 00:17:39.354 } 00:17:39.354 ], 00:17:39.354 "driver_specific": { 00:17:39.354 "raid": { 00:17:39.354 "uuid": "9d8cb07a-74fc-4a12-b3a2-2892cde54a7f", 00:17:39.354 "strip_size_kb": 0, 00:17:39.354 "state": "online", 00:17:39.354 "raid_level": "raid1", 00:17:39.354 "superblock": true, 00:17:39.354 "num_base_bdevs": 3, 00:17:39.354 "num_base_bdevs_discovered": 3, 00:17:39.354 "num_base_bdevs_operational": 3, 00:17:39.354 "base_bdevs_list": [ 00:17:39.354 { 00:17:39.354 "name": "NewBaseBdev", 00:17:39.354 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:39.354 "is_configured": true, 00:17:39.354 "data_offset": 2048, 00:17:39.354 "data_size": 63488 00:17:39.354 }, 00:17:39.354 { 00:17:39.354 "name": "BaseBdev2", 00:17:39.354 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:39.354 "is_configured": true, 00:17:39.354 "data_offset": 2048, 00:17:39.354 "data_size": 63488 00:17:39.354 }, 00:17:39.354 { 00:17:39.354 "name": "BaseBdev3", 00:17:39.354 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:39.354 "is_configured": true, 00:17:39.354 "data_offset": 2048, 00:17:39.354 "data_size": 63488 00:17:39.354 } 00:17:39.354 ] 00:17:39.354 } 00:17:39.354 } 00:17:39.354 }' 00:17:39.354 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:39.354 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:39.354 BaseBdev2 00:17:39.354 
BaseBdev3' 00:17:39.354 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:39.354 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:39.354 19:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:39.922 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:39.922 "name": "NewBaseBdev", 00:17:39.922 "aliases": [ 00:17:39.922 "01c0340f-43c5-423d-ad82-f6b74a25218f" 00:17:39.922 ], 00:17:39.922 "product_name": "Malloc disk", 00:17:39.922 "block_size": 512, 00:17:39.922 "num_blocks": 65536, 00:17:39.922 "uuid": "01c0340f-43c5-423d-ad82-f6b74a25218f", 00:17:39.922 "assigned_rate_limits": { 00:17:39.922 "rw_ios_per_sec": 0, 00:17:39.922 "rw_mbytes_per_sec": 0, 00:17:39.922 "r_mbytes_per_sec": 0, 00:17:39.922 "w_mbytes_per_sec": 0 00:17:39.922 }, 00:17:39.922 "claimed": true, 00:17:39.922 "claim_type": "exclusive_write", 00:17:39.922 "zoned": false, 00:17:39.922 "supported_io_types": { 00:17:39.922 "read": true, 00:17:39.922 "write": true, 00:17:39.922 "unmap": true, 00:17:39.922 "flush": true, 00:17:39.922 "reset": true, 00:17:39.922 "nvme_admin": false, 00:17:39.922 "nvme_io": false, 00:17:39.922 "nvme_io_md": false, 00:17:39.922 "write_zeroes": true, 00:17:39.922 "zcopy": true, 00:17:39.922 "get_zone_info": false, 00:17:39.922 "zone_management": false, 00:17:39.922 "zone_append": false, 00:17:39.922 "compare": false, 00:17:39.922 "compare_and_write": false, 00:17:39.923 "abort": true, 00:17:39.923 "seek_hole": false, 00:17:39.923 "seek_data": false, 00:17:39.923 "copy": true, 00:17:39.923 "nvme_iov_md": false 00:17:39.923 }, 00:17:39.923 "memory_domains": [ 00:17:39.923 { 00:17:39.923 "dma_device_id": "system", 00:17:39.923 "dma_device_type": 1 00:17:39.923 }, 00:17:39.923 { 
00:17:39.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.923 "dma_device_type": 2 00:17:39.923 } 00:17:39.923 ], 00:17:39.923 "driver_specific": {} 00:17:39.923 }' 00:17:39.923 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.923 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.923 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:39.923 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.181 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.181 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:40.181 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.181 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.441 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.441 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.441 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.441 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.441 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:40.441 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:40.441 19:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:41.010 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:41.010 "name": 
"BaseBdev2", 00:17:41.010 "aliases": [ 00:17:41.010 "f8372fd6-c8d7-4971-9801-29f1a6974352" 00:17:41.010 ], 00:17:41.010 "product_name": "Malloc disk", 00:17:41.010 "block_size": 512, 00:17:41.010 "num_blocks": 65536, 00:17:41.010 "uuid": "f8372fd6-c8d7-4971-9801-29f1a6974352", 00:17:41.010 "assigned_rate_limits": { 00:17:41.010 "rw_ios_per_sec": 0, 00:17:41.010 "rw_mbytes_per_sec": 0, 00:17:41.010 "r_mbytes_per_sec": 0, 00:17:41.010 "w_mbytes_per_sec": 0 00:17:41.010 }, 00:17:41.010 "claimed": true, 00:17:41.010 "claim_type": "exclusive_write", 00:17:41.010 "zoned": false, 00:17:41.010 "supported_io_types": { 00:17:41.010 "read": true, 00:17:41.010 "write": true, 00:17:41.010 "unmap": true, 00:17:41.010 "flush": true, 00:17:41.010 "reset": true, 00:17:41.010 "nvme_admin": false, 00:17:41.010 "nvme_io": false, 00:17:41.010 "nvme_io_md": false, 00:17:41.010 "write_zeroes": true, 00:17:41.010 "zcopy": true, 00:17:41.010 "get_zone_info": false, 00:17:41.010 "zone_management": false, 00:17:41.010 "zone_append": false, 00:17:41.010 "compare": false, 00:17:41.010 "compare_and_write": false, 00:17:41.010 "abort": true, 00:17:41.010 "seek_hole": false, 00:17:41.010 "seek_data": false, 00:17:41.010 "copy": true, 00:17:41.010 "nvme_iov_md": false 00:17:41.010 }, 00:17:41.010 "memory_domains": [ 00:17:41.010 { 00:17:41.010 "dma_device_id": "system", 00:17:41.010 "dma_device_type": 1 00:17:41.010 }, 00:17:41.010 { 00:17:41.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.010 "dma_device_type": 2 00:17:41.010 } 00:17:41.010 ], 00:17:41.010 "driver_specific": {} 00:17:41.010 }' 00:17:41.010 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.010 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.010 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.010 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:17:41.010 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:41.269 19:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:41.528 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:41.528 "name": "BaseBdev3", 00:17:41.528 "aliases": [ 00:17:41.528 "862fc599-e694-4e48-bac5-f568e4847411" 00:17:41.528 ], 00:17:41.528 "product_name": "Malloc disk", 00:17:41.528 "block_size": 512, 00:17:41.528 "num_blocks": 65536, 00:17:41.528 "uuid": "862fc599-e694-4e48-bac5-f568e4847411", 00:17:41.528 "assigned_rate_limits": { 00:17:41.528 "rw_ios_per_sec": 0, 00:17:41.528 "rw_mbytes_per_sec": 0, 00:17:41.528 "r_mbytes_per_sec": 0, 00:17:41.528 "w_mbytes_per_sec": 0 00:17:41.528 }, 00:17:41.528 "claimed": true, 00:17:41.528 "claim_type": "exclusive_write", 00:17:41.528 "zoned": 
false, 00:17:41.528 "supported_io_types": { 00:17:41.528 "read": true, 00:17:41.528 "write": true, 00:17:41.528 "unmap": true, 00:17:41.528 "flush": true, 00:17:41.528 "reset": true, 00:17:41.529 "nvme_admin": false, 00:17:41.529 "nvme_io": false, 00:17:41.529 "nvme_io_md": false, 00:17:41.529 "write_zeroes": true, 00:17:41.529 "zcopy": true, 00:17:41.529 "get_zone_info": false, 00:17:41.529 "zone_management": false, 00:17:41.529 "zone_append": false, 00:17:41.529 "compare": false, 00:17:41.529 "compare_and_write": false, 00:17:41.529 "abort": true, 00:17:41.529 "seek_hole": false, 00:17:41.529 "seek_data": false, 00:17:41.529 "copy": true, 00:17:41.529 "nvme_iov_md": false 00:17:41.529 }, 00:17:41.529 "memory_domains": [ 00:17:41.529 { 00:17:41.529 "dma_device_id": "system", 00:17:41.529 "dma_device_type": 1 00:17:41.529 }, 00:17:41.529 { 00:17:41.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.529 "dma_device_type": 2 00:17:41.529 } 00:17:41.529 ], 00:17:41.529 "driver_specific": {} 00:17:41.529 }' 00:17:41.529 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.529 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.529 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.529 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.787 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.787 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:41.787 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.787 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.788 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:41.788 19:53:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.788 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.788 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:41.788 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:42.046 [2024-07-24 19:53:33.592744] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:42.046 [2024-07-24 19:53:33.592771] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:42.046 [2024-07-24 19:53:33.592820] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:42.046 [2024-07-24 19:53:33.593099] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:42.046 [2024-07-24 19:53:33.593111] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20aa970 name Existed_Raid, state offline 00:17:42.046 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1425577 00:17:42.046 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1425577 ']' 00:17:42.046 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1425577 00:17:42.046 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:17:42.046 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:42.046 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1425577 00:17:42.304 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 
00:17:42.304 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:42.304 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1425577' 00:17:42.304 killing process with pid 1425577 00:17:42.304 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1425577 00:17:42.304 [2024-07-24 19:53:33.666176] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:42.304 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1425577 00:17:42.304 [2024-07-24 19:53:33.693324] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:42.564 19:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:42.564 00:17:42.564 real 0m30.826s 00:17:42.564 user 0m57.001s 00:17:42.564 sys 0m5.546s 00:17:42.564 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:42.564 19:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:42.564 ************************************ 00:17:42.564 END TEST raid_state_function_test_sb 00:17:42.564 ************************************ 00:17:42.564 19:53:33 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:17:42.564 19:53:33 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:17:42.564 19:53:33 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:42.564 19:53:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:42.564 ************************************ 00:17:42.564 START TEST raid_superblock_test 00:17:42.564 ************************************ 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 3 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1430095 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1430095 /var/tmp/spdk-raid.sock 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@831 -- # '[' -z 1430095 ']' 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:42.564 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:42.564 19:53:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.564 [2024-07-24 19:53:34.066946] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:17:42.564 [2024-07-24 19:53:34.067011] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1430095 ] 00:17:42.823 [2024-07-24 19:53:34.189732] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:42.823 [2024-07-24 19:53:34.288804] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:42.824 [2024-07-24 19:53:34.350740] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:42.824 [2024-07-24 19:53:34.350784] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:17:43.816 19:53:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:43.816 malloc1 00:17:43.816 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:44.075 [2024-07-24 19:53:35.508818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:44.075 [2024-07-24 19:53:35.508867] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:44.075 [2024-07-24 19:53:35.508888] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a15590 00:17:44.075 [2024-07-24 19:53:35.508901] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:44.075 [2024-07-24 19:53:35.510666] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:44.075 [2024-07-24 19:53:35.510695] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:44.075 pt1 
00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:44.076 19:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:44.644 malloc2 00:17:44.644 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:44.903 [2024-07-24 19:53:36.272779] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:44.903 [2024-07-24 19:53:36.272831] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:44.903 [2024-07-24 19:53:36.272850] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbb690 00:17:44.903 [2024-07-24 19:53:36.272862] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:44.903 [2024-07-24 19:53:36.274447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:44.903 [2024-07-24 
19:53:36.274475] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:44.903 pt2 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:44.903 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:45.472 malloc3 00:17:45.472 19:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:45.472 [2024-07-24 19:53:37.035406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:45.472 [2024-07-24 19:53:37.035453] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.472 [2024-07-24 19:53:37.035470] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbcfc0 00:17:45.472 [2024-07-24 19:53:37.035483] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.472 [2024-07-24 
19:53:37.037015] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.472 [2024-07-24 19:53:37.037044] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:45.472 pt3 00:17:45.472 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:17:45.472 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:17:45.472 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:46.041 [2024-07-24 19:53:37.532708] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:46.041 [2024-07-24 19:53:37.534076] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:46.041 [2024-07-24 19:53:37.534133] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:46.041 [2024-07-24 19:53:37.534300] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bbdd10 00:17:46.041 [2024-07-24 19:53:37.534311] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:46.041 [2024-07-24 19:53:37.534525] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a2c480 00:17:46.041 [2024-07-24 19:53:37.534680] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bbdd10 00:17:46.041 [2024-07-24 19:53:37.534690] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bbdd10 00:17:46.041 [2024-07-24 19:53:37.534796] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=raid_bdev1 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.041 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:46.300 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.300 "name": "raid_bdev1", 00:17:46.300 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:46.300 "strip_size_kb": 0, 00:17:46.300 "state": "online", 00:17:46.300 "raid_level": "raid1", 00:17:46.300 "superblock": true, 00:17:46.300 "num_base_bdevs": 3, 00:17:46.300 "num_base_bdevs_discovered": 3, 00:17:46.300 "num_base_bdevs_operational": 3, 00:17:46.300 "base_bdevs_list": [ 00:17:46.300 { 00:17:46.300 "name": "pt1", 00:17:46.300 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:46.300 "is_configured": true, 00:17:46.300 "data_offset": 2048, 00:17:46.300 "data_size": 63488 00:17:46.300 }, 00:17:46.300 { 00:17:46.300 "name": "pt2", 00:17:46.300 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:17:46.300 "is_configured": true, 00:17:46.300 "data_offset": 2048, 00:17:46.300 "data_size": 63488 00:17:46.300 }, 00:17:46.300 { 00:17:46.300 "name": "pt3", 00:17:46.300 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:46.300 "is_configured": true, 00:17:46.300 "data_offset": 2048, 00:17:46.300 "data_size": 63488 00:17:46.300 } 00:17:46.300 ] 00:17:46.300 }' 00:17:46.300 19:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.300 19:53:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:47.237 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:47.496 [2024-07-24 19:53:38.952713] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:47.496 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:47.496 "name": "raid_bdev1", 00:17:47.496 "aliases": [ 00:17:47.496 "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a" 00:17:47.496 ], 00:17:47.496 "product_name": "Raid Volume", 00:17:47.496 "block_size": 512, 00:17:47.496 "num_blocks": 
63488, 00:17:47.496 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:47.496 "assigned_rate_limits": { 00:17:47.496 "rw_ios_per_sec": 0, 00:17:47.496 "rw_mbytes_per_sec": 0, 00:17:47.496 "r_mbytes_per_sec": 0, 00:17:47.496 "w_mbytes_per_sec": 0 00:17:47.496 }, 00:17:47.496 "claimed": false, 00:17:47.496 "zoned": false, 00:17:47.496 "supported_io_types": { 00:17:47.496 "read": true, 00:17:47.496 "write": true, 00:17:47.496 "unmap": false, 00:17:47.496 "flush": false, 00:17:47.496 "reset": true, 00:17:47.496 "nvme_admin": false, 00:17:47.496 "nvme_io": false, 00:17:47.496 "nvme_io_md": false, 00:17:47.496 "write_zeroes": true, 00:17:47.496 "zcopy": false, 00:17:47.496 "get_zone_info": false, 00:17:47.496 "zone_management": false, 00:17:47.496 "zone_append": false, 00:17:47.496 "compare": false, 00:17:47.496 "compare_and_write": false, 00:17:47.496 "abort": false, 00:17:47.496 "seek_hole": false, 00:17:47.496 "seek_data": false, 00:17:47.496 "copy": false, 00:17:47.496 "nvme_iov_md": false 00:17:47.496 }, 00:17:47.496 "memory_domains": [ 00:17:47.496 { 00:17:47.496 "dma_device_id": "system", 00:17:47.496 "dma_device_type": 1 00:17:47.496 }, 00:17:47.496 { 00:17:47.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.496 "dma_device_type": 2 00:17:47.496 }, 00:17:47.496 { 00:17:47.496 "dma_device_id": "system", 00:17:47.496 "dma_device_type": 1 00:17:47.496 }, 00:17:47.496 { 00:17:47.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.496 "dma_device_type": 2 00:17:47.496 }, 00:17:47.496 { 00:17:47.496 "dma_device_id": "system", 00:17:47.496 "dma_device_type": 1 00:17:47.496 }, 00:17:47.496 { 00:17:47.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.496 "dma_device_type": 2 00:17:47.496 } 00:17:47.496 ], 00:17:47.496 "driver_specific": { 00:17:47.496 "raid": { 00:17:47.496 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:47.496 "strip_size_kb": 0, 00:17:47.496 "state": "online", 00:17:47.496 "raid_level": "raid1", 00:17:47.496 "superblock": true, 
00:17:47.496 "num_base_bdevs": 3, 00:17:47.496 "num_base_bdevs_discovered": 3, 00:17:47.496 "num_base_bdevs_operational": 3, 00:17:47.496 "base_bdevs_list": [ 00:17:47.496 { 00:17:47.496 "name": "pt1", 00:17:47.496 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:47.496 "is_configured": true, 00:17:47.496 "data_offset": 2048, 00:17:47.496 "data_size": 63488 00:17:47.496 }, 00:17:47.496 { 00:17:47.496 "name": "pt2", 00:17:47.496 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:47.496 "is_configured": true, 00:17:47.496 "data_offset": 2048, 00:17:47.496 "data_size": 63488 00:17:47.496 }, 00:17:47.496 { 00:17:47.496 "name": "pt3", 00:17:47.496 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:47.496 "is_configured": true, 00:17:47.496 "data_offset": 2048, 00:17:47.496 "data_size": 63488 00:17:47.496 } 00:17:47.496 ] 00:17:47.496 } 00:17:47.496 } 00:17:47.496 }' 00:17:47.496 19:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:47.496 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:47.496 pt2 00:17:47.496 pt3' 00:17:47.496 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.496 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:47.496 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.065 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.065 "name": "pt1", 00:17:48.065 "aliases": [ 00:17:48.065 "00000000-0000-0000-0000-000000000001" 00:17:48.065 ], 00:17:48.065 "product_name": "passthru", 00:17:48.065 "block_size": 512, 00:17:48.065 "num_blocks": 65536, 00:17:48.065 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:48.065 
"assigned_rate_limits": { 00:17:48.065 "rw_ios_per_sec": 0, 00:17:48.065 "rw_mbytes_per_sec": 0, 00:17:48.065 "r_mbytes_per_sec": 0, 00:17:48.065 "w_mbytes_per_sec": 0 00:17:48.065 }, 00:17:48.065 "claimed": true, 00:17:48.065 "claim_type": "exclusive_write", 00:17:48.065 "zoned": false, 00:17:48.065 "supported_io_types": { 00:17:48.065 "read": true, 00:17:48.065 "write": true, 00:17:48.065 "unmap": true, 00:17:48.065 "flush": true, 00:17:48.065 "reset": true, 00:17:48.065 "nvme_admin": false, 00:17:48.065 "nvme_io": false, 00:17:48.065 "nvme_io_md": false, 00:17:48.065 "write_zeroes": true, 00:17:48.065 "zcopy": true, 00:17:48.065 "get_zone_info": false, 00:17:48.065 "zone_management": false, 00:17:48.065 "zone_append": false, 00:17:48.065 "compare": false, 00:17:48.065 "compare_and_write": false, 00:17:48.065 "abort": true, 00:17:48.065 "seek_hole": false, 00:17:48.065 "seek_data": false, 00:17:48.065 "copy": true, 00:17:48.065 "nvme_iov_md": false 00:17:48.065 }, 00:17:48.065 "memory_domains": [ 00:17:48.065 { 00:17:48.065 "dma_device_id": "system", 00:17:48.065 "dma_device_type": 1 00:17:48.065 }, 00:17:48.065 { 00:17:48.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.065 "dma_device_type": 2 00:17:48.065 } 00:17:48.065 ], 00:17:48.065 "driver_specific": { 00:17:48.065 "passthru": { 00:17:48.065 "name": "pt1", 00:17:48.065 "base_bdev_name": "malloc1" 00:17:48.065 } 00:17:48.065 } 00:17:48.065 }' 00:17:48.065 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.065 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
[[ null == null ]] 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.324 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.325 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.584 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.584 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:48.584 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:48.584 19:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.584 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.584 "name": "pt2", 00:17:48.584 "aliases": [ 00:17:48.584 "00000000-0000-0000-0000-000000000002" 00:17:48.584 ], 00:17:48.584 "product_name": "passthru", 00:17:48.584 "block_size": 512, 00:17:48.584 "num_blocks": 65536, 00:17:48.584 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:48.584 "assigned_rate_limits": { 00:17:48.584 "rw_ios_per_sec": 0, 00:17:48.584 "rw_mbytes_per_sec": 0, 00:17:48.584 "r_mbytes_per_sec": 0, 00:17:48.584 "w_mbytes_per_sec": 0 00:17:48.584 }, 00:17:48.584 "claimed": true, 00:17:48.584 "claim_type": "exclusive_write", 00:17:48.584 "zoned": false, 00:17:48.584 "supported_io_types": { 00:17:48.584 "read": true, 00:17:48.584 "write": true, 00:17:48.584 "unmap": true, 00:17:48.584 "flush": true, 00:17:48.584 "reset": true, 00:17:48.584 "nvme_admin": false, 00:17:48.584 "nvme_io": false, 00:17:48.584 "nvme_io_md": false, 00:17:48.584 
"write_zeroes": true, 00:17:48.584 "zcopy": true, 00:17:48.584 "get_zone_info": false, 00:17:48.584 "zone_management": false, 00:17:48.584 "zone_append": false, 00:17:48.584 "compare": false, 00:17:48.584 "compare_and_write": false, 00:17:48.584 "abort": true, 00:17:48.584 "seek_hole": false, 00:17:48.584 "seek_data": false, 00:17:48.584 "copy": true, 00:17:48.584 "nvme_iov_md": false 00:17:48.584 }, 00:17:48.584 "memory_domains": [ 00:17:48.584 { 00:17:48.584 "dma_device_id": "system", 00:17:48.584 "dma_device_type": 1 00:17:48.584 }, 00:17:48.584 { 00:17:48.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.584 "dma_device_type": 2 00:17:48.584 } 00:17:48.584 ], 00:17:48.584 "driver_specific": { 00:17:48.584 "passthru": { 00:17:48.584 "name": "pt2", 00:17:48.584 "base_bdev_name": "malloc2" 00:17:48.584 } 00:17:48.584 } 00:17:48.584 }' 00:17:48.584 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.844 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.844 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.844 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.844 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.844 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.844 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.844 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.103 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.103 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.103 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.103 19:53:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.103 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:49.103 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:49.103 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:49.363 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:49.363 "name": "pt3", 00:17:49.363 "aliases": [ 00:17:49.363 "00000000-0000-0000-0000-000000000003" 00:17:49.363 ], 00:17:49.363 "product_name": "passthru", 00:17:49.363 "block_size": 512, 00:17:49.363 "num_blocks": 65536, 00:17:49.363 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:49.363 "assigned_rate_limits": { 00:17:49.363 "rw_ios_per_sec": 0, 00:17:49.363 "rw_mbytes_per_sec": 0, 00:17:49.363 "r_mbytes_per_sec": 0, 00:17:49.363 "w_mbytes_per_sec": 0 00:17:49.363 }, 00:17:49.363 "claimed": true, 00:17:49.363 "claim_type": "exclusive_write", 00:17:49.363 "zoned": false, 00:17:49.363 "supported_io_types": { 00:17:49.363 "read": true, 00:17:49.363 "write": true, 00:17:49.363 "unmap": true, 00:17:49.363 "flush": true, 00:17:49.363 "reset": true, 00:17:49.363 "nvme_admin": false, 00:17:49.363 "nvme_io": false, 00:17:49.363 "nvme_io_md": false, 00:17:49.363 "write_zeroes": true, 00:17:49.363 "zcopy": true, 00:17:49.363 "get_zone_info": false, 00:17:49.363 "zone_management": false, 00:17:49.363 "zone_append": false, 00:17:49.363 "compare": false, 00:17:49.363 "compare_and_write": false, 00:17:49.363 "abort": true, 00:17:49.363 "seek_hole": false, 00:17:49.363 "seek_data": false, 00:17:49.363 "copy": true, 00:17:49.363 "nvme_iov_md": false 00:17:49.363 }, 00:17:49.363 "memory_domains": [ 00:17:49.363 { 00:17:49.363 "dma_device_id": "system", 00:17:49.363 "dma_device_type": 1 00:17:49.363 }, 00:17:49.363 { 00:17:49.363 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.363 "dma_device_type": 2 00:17:49.363 } 00:17:49.363 ], 00:17:49.363 "driver_specific": { 00:17:49.363 "passthru": { 00:17:49.363 "name": "pt3", 00:17:49.363 "base_bdev_name": "malloc3" 00:17:49.363 } 00:17:49.363 } 00:17:49.363 }' 00:17:49.363 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.363 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.363 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:49.363 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.363 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.625 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:49.625 19:53:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.625 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.625 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.625 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.625 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.625 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.625 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:49.625 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:17:49.945 [2024-07-24 19:53:41.375141] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:49.945 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # 
raid_bdev_uuid=ec4f620e-ef9a-4a82-b71c-8b767b8ba72a 00:17:49.945 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ec4f620e-ef9a-4a82-b71c-8b767b8ba72a ']' 00:17:49.945 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:50.204 [2024-07-24 19:53:41.619500] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:50.204 [2024-07-24 19:53:41.619521] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:50.204 [2024-07-24 19:53:41.619571] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:50.204 [2024-07-24 19:53:41.619639] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:50.204 [2024-07-24 19:53:41.619651] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbdd10 name raid_bdev1, state offline 00:17:50.204 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.204 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:17:50.463 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:17:50.463 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:17:50.463 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:17:50.463 19:53:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:50.722 19:53:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:17:50.722 19:53:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:51.290 19:53:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:17:51.290 19:53:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:51.550 19:53:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:51.550 19:53:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:51.550 
19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:51.550 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:51.809 [2024-07-24 19:53:43.360035] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:51.809 [2024-07-24 19:53:43.361373] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:51.809 [2024-07-24 19:53:43.361426] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:51.809 [2024-07-24 19:53:43.361470] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:51.809 [2024-07-24 19:53:43.361508] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:51.809 [2024-07-24 19:53:43.361532] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:51.809 [2024-07-24 19:53:43.361550] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:51.809 [2024-07-24 19:53:43.361560] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0cdb0 name raid_bdev1, state configuring 00:17:51.809 request: 00:17:51.809 { 00:17:51.809 "name": "raid_bdev1", 00:17:51.809 "raid_level": "raid1", 00:17:51.809 "base_bdevs": [ 00:17:51.809 "malloc1", 00:17:51.809 "malloc2", 00:17:51.809 "malloc3" 00:17:51.809 ], 00:17:51.809 "superblock": false, 00:17:51.809 "method": "bdev_raid_create", 00:17:51.809 "req_id": 1 00:17:51.809 } 00:17:51.809 Got JSON-RPC error response 00:17:51.809 response: 00:17:51.809 { 00:17:51.809 "code": -17, 00:17:51.809 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:51.809 } 00:17:51.809 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:17:51.809 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:17:51.809 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:17:51.809 19:53:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:17:51.809 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.809 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:17:52.067 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:17:52.067 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:17:52.067 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:52.327 [2024-07-24 19:53:43.841240] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:52.327 [2024-07-24 19:53:43.841282] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:17:52.327 [2024-07-24 19:53:43.841300] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0c9f0 00:17:52.327 [2024-07-24 19:53:43.841312] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:52.327 [2024-07-24 19:53:43.842954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:52.327 [2024-07-24 19:53:43.842990] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:52.327 [2024-07-24 19:53:43.843057] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:52.327 [2024-07-24 19:53:43.843084] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:52.327 pt1 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.327 19:53:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:52.587 19:53:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.587 "name": "raid_bdev1", 00:17:52.587 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:52.587 "strip_size_kb": 0, 00:17:52.587 "state": "configuring", 00:17:52.587 "raid_level": "raid1", 00:17:52.587 "superblock": true, 00:17:52.587 "num_base_bdevs": 3, 00:17:52.587 "num_base_bdevs_discovered": 1, 00:17:52.587 "num_base_bdevs_operational": 3, 00:17:52.587 "base_bdevs_list": [ 00:17:52.587 { 00:17:52.587 "name": "pt1", 00:17:52.587 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:52.587 "is_configured": true, 00:17:52.587 "data_offset": 2048, 00:17:52.587 "data_size": 63488 00:17:52.587 }, 00:17:52.587 { 00:17:52.587 "name": null, 00:17:52.587 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:52.587 "is_configured": false, 00:17:52.587 "data_offset": 2048, 00:17:52.587 "data_size": 63488 00:17:52.587 }, 00:17:52.587 { 00:17:52.587 "name": null, 00:17:52.587 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:52.587 "is_configured": false, 00:17:52.587 "data_offset": 2048, 00:17:52.587 "data_size": 63488 00:17:52.587 } 00:17:52.587 ] 00:17:52.587 }' 00:17:52.587 19:53:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.587 19:53:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.155 19:53:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:17:53.155 19:53:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:53.415 
[2024-07-24 19:53:44.928141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:53.415 [2024-07-24 19:53:44.928191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:53.415 [2024-07-24 19:53:44.928210] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbb8c0 00:17:53.415 [2024-07-24 19:53:44.928223] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:53.415 [2024-07-24 19:53:44.928562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:53.415 [2024-07-24 19:53:44.928580] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:53.415 [2024-07-24 19:53:44.928643] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:53.415 [2024-07-24 19:53:44.928662] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:53.415 pt2 00:17:53.415 19:53:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:53.674 [2024-07-24 19:53:45.176823] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:53.674 19:53:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.674 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:53.933 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.933 "name": "raid_bdev1", 00:17:53.933 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:53.933 "strip_size_kb": 0, 00:17:53.933 "state": "configuring", 00:17:53.933 "raid_level": "raid1", 00:17:53.933 "superblock": true, 00:17:53.933 "num_base_bdevs": 3, 00:17:53.933 "num_base_bdevs_discovered": 1, 00:17:53.933 "num_base_bdevs_operational": 3, 00:17:53.933 "base_bdevs_list": [ 00:17:53.933 { 00:17:53.933 "name": "pt1", 00:17:53.934 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:53.934 "is_configured": true, 00:17:53.934 "data_offset": 2048, 00:17:53.934 "data_size": 63488 00:17:53.934 }, 00:17:53.934 { 00:17:53.934 "name": null, 00:17:53.934 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:53.934 "is_configured": false, 00:17:53.934 "data_offset": 2048, 00:17:53.934 "data_size": 63488 00:17:53.934 }, 00:17:53.934 { 00:17:53.934 "name": null, 00:17:53.934 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:53.934 "is_configured": false, 00:17:53.934 "data_offset": 2048, 00:17:53.934 "data_size": 63488 00:17:53.934 } 00:17:53.934 ] 00:17:53.934 }' 00:17:53.934 19:53:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:17:53.934 19:53:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.501 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:17:54.501 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:54.501 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:54.760 [2024-07-24 19:53:46.287750] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:54.760 [2024-07-24 19:53:46.287793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.760 [2024-07-24 19:53:46.287811] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbac70 00:17:54.760 [2024-07-24 19:53:46.287823] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.760 [2024-07-24 19:53:46.288148] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:54.760 [2024-07-24 19:53:46.288164] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:54.760 [2024-07-24 19:53:46.288223] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:54.760 [2024-07-24 19:53:46.288242] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:54.760 pt2 00:17:54.760 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:54.760 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:54.760 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 
00000000-0000-0000-0000-000000000003 00:17:55.019 [2024-07-24 19:53:46.472236] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:55.019 [2024-07-24 19:53:46.472268] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:55.019 [2024-07-24 19:53:46.472283] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0d260 00:17:55.019 [2024-07-24 19:53:46.472295] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:55.019 [2024-07-24 19:53:46.472591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:55.019 [2024-07-24 19:53:46.472609] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:55.019 [2024-07-24 19:53:46.472661] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:55.019 [2024-07-24 19:53:46.472679] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:55.019 [2024-07-24 19:53:46.472786] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bc6ec0 00:17:55.019 [2024-07-24 19:53:46.472796] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:55.019 [2024-07-24 19:53:46.472956] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a10710 00:17:55.019 [2024-07-24 19:53:46.473085] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bc6ec0 00:17:55.019 [2024-07-24 19:53:46.473095] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bc6ec0 00:17:55.019 [2024-07-24 19:53:46.473191] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:55.019 pt3 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:55.019 19:53:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.019 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:55.278 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.278 "name": "raid_bdev1", 00:17:55.278 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:55.278 "strip_size_kb": 0, 00:17:55.278 "state": "online", 00:17:55.278 "raid_level": "raid1", 00:17:55.278 "superblock": true, 00:17:55.278 "num_base_bdevs": 3, 00:17:55.278 "num_base_bdevs_discovered": 3, 00:17:55.278 "num_base_bdevs_operational": 3, 00:17:55.278 "base_bdevs_list": [ 00:17:55.278 { 00:17:55.278 "name": "pt1", 00:17:55.278 "uuid": "00000000-0000-0000-0000-000000000001", 
00:17:55.278 "is_configured": true, 00:17:55.278 "data_offset": 2048, 00:17:55.278 "data_size": 63488 00:17:55.278 }, 00:17:55.278 { 00:17:55.278 "name": "pt2", 00:17:55.278 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:55.278 "is_configured": true, 00:17:55.278 "data_offset": 2048, 00:17:55.278 "data_size": 63488 00:17:55.278 }, 00:17:55.278 { 00:17:55.278 "name": "pt3", 00:17:55.278 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:55.278 "is_configured": true, 00:17:55.278 "data_offset": 2048, 00:17:55.278 "data_size": 63488 00:17:55.278 } 00:17:55.278 ] 00:17:55.278 }' 00:17:55.278 19:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.278 19:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:55.846 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:56.106 [2024-07-24 19:53:47.587465] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:56.106 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:56.106 "name": "raid_bdev1", 00:17:56.106 
"aliases": [ 00:17:56.106 "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a" 00:17:56.106 ], 00:17:56.106 "product_name": "Raid Volume", 00:17:56.106 "block_size": 512, 00:17:56.106 "num_blocks": 63488, 00:17:56.106 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:56.106 "assigned_rate_limits": { 00:17:56.106 "rw_ios_per_sec": 0, 00:17:56.106 "rw_mbytes_per_sec": 0, 00:17:56.106 "r_mbytes_per_sec": 0, 00:17:56.106 "w_mbytes_per_sec": 0 00:17:56.106 }, 00:17:56.106 "claimed": false, 00:17:56.106 "zoned": false, 00:17:56.106 "supported_io_types": { 00:17:56.106 "read": true, 00:17:56.106 "write": true, 00:17:56.106 "unmap": false, 00:17:56.106 "flush": false, 00:17:56.106 "reset": true, 00:17:56.106 "nvme_admin": false, 00:17:56.106 "nvme_io": false, 00:17:56.106 "nvme_io_md": false, 00:17:56.106 "write_zeroes": true, 00:17:56.106 "zcopy": false, 00:17:56.106 "get_zone_info": false, 00:17:56.106 "zone_management": false, 00:17:56.106 "zone_append": false, 00:17:56.106 "compare": false, 00:17:56.106 "compare_and_write": false, 00:17:56.106 "abort": false, 00:17:56.106 "seek_hole": false, 00:17:56.106 "seek_data": false, 00:17:56.106 "copy": false, 00:17:56.106 "nvme_iov_md": false 00:17:56.106 }, 00:17:56.106 "memory_domains": [ 00:17:56.106 { 00:17:56.106 "dma_device_id": "system", 00:17:56.106 "dma_device_type": 1 00:17:56.106 }, 00:17:56.106 { 00:17:56.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.106 "dma_device_type": 2 00:17:56.106 }, 00:17:56.106 { 00:17:56.106 "dma_device_id": "system", 00:17:56.106 "dma_device_type": 1 00:17:56.106 }, 00:17:56.106 { 00:17:56.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.106 "dma_device_type": 2 00:17:56.106 }, 00:17:56.106 { 00:17:56.106 "dma_device_id": "system", 00:17:56.106 "dma_device_type": 1 00:17:56.106 }, 00:17:56.106 { 00:17:56.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.106 "dma_device_type": 2 00:17:56.106 } 00:17:56.106 ], 00:17:56.106 "driver_specific": { 00:17:56.106 "raid": { 00:17:56.106 
"uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:56.106 "strip_size_kb": 0, 00:17:56.106 "state": "online", 00:17:56.106 "raid_level": "raid1", 00:17:56.106 "superblock": true, 00:17:56.106 "num_base_bdevs": 3, 00:17:56.106 "num_base_bdevs_discovered": 3, 00:17:56.106 "num_base_bdevs_operational": 3, 00:17:56.106 "base_bdevs_list": [ 00:17:56.106 { 00:17:56.106 "name": "pt1", 00:17:56.106 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:56.106 "is_configured": true, 00:17:56.106 "data_offset": 2048, 00:17:56.106 "data_size": 63488 00:17:56.106 }, 00:17:56.106 { 00:17:56.106 "name": "pt2", 00:17:56.106 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:56.106 "is_configured": true, 00:17:56.106 "data_offset": 2048, 00:17:56.106 "data_size": 63488 00:17:56.106 }, 00:17:56.106 { 00:17:56.106 "name": "pt3", 00:17:56.106 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:56.106 "is_configured": true, 00:17:56.106 "data_offset": 2048, 00:17:56.106 "data_size": 63488 00:17:56.106 } 00:17:56.106 ] 00:17:56.106 } 00:17:56.106 } 00:17:56.106 }' 00:17:56.106 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:56.106 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:56.106 pt2 00:17:56.106 pt3' 00:17:56.106 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:56.106 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:56.106 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:56.365 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:56.365 "name": "pt1", 00:17:56.365 "aliases": [ 00:17:56.365 "00000000-0000-0000-0000-000000000001" 00:17:56.365 ], 
00:17:56.365 "product_name": "passthru", 00:17:56.365 "block_size": 512, 00:17:56.365 "num_blocks": 65536, 00:17:56.365 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:56.365 "assigned_rate_limits": { 00:17:56.365 "rw_ios_per_sec": 0, 00:17:56.365 "rw_mbytes_per_sec": 0, 00:17:56.365 "r_mbytes_per_sec": 0, 00:17:56.365 "w_mbytes_per_sec": 0 00:17:56.365 }, 00:17:56.365 "claimed": true, 00:17:56.365 "claim_type": "exclusive_write", 00:17:56.365 "zoned": false, 00:17:56.365 "supported_io_types": { 00:17:56.365 "read": true, 00:17:56.365 "write": true, 00:17:56.365 "unmap": true, 00:17:56.365 "flush": true, 00:17:56.365 "reset": true, 00:17:56.365 "nvme_admin": false, 00:17:56.365 "nvme_io": false, 00:17:56.365 "nvme_io_md": false, 00:17:56.365 "write_zeroes": true, 00:17:56.365 "zcopy": true, 00:17:56.365 "get_zone_info": false, 00:17:56.365 "zone_management": false, 00:17:56.365 "zone_append": false, 00:17:56.365 "compare": false, 00:17:56.365 "compare_and_write": false, 00:17:56.365 "abort": true, 00:17:56.365 "seek_hole": false, 00:17:56.365 "seek_data": false, 00:17:56.365 "copy": true, 00:17:56.365 "nvme_iov_md": false 00:17:56.365 }, 00:17:56.365 "memory_domains": [ 00:17:56.365 { 00:17:56.365 "dma_device_id": "system", 00:17:56.365 "dma_device_type": 1 00:17:56.365 }, 00:17:56.365 { 00:17:56.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.365 "dma_device_type": 2 00:17:56.365 } 00:17:56.365 ], 00:17:56.365 "driver_specific": { 00:17:56.365 "passthru": { 00:17:56.365 "name": "pt1", 00:17:56.365 "base_bdev_name": "malloc1" 00:17:56.365 } 00:17:56.365 } 00:17:56.365 }' 00:17:56.365 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:56.365 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:56.624 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:56.624 19:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:17:56.624 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:56.624 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:56.624 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:56.624 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:56.624 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:56.624 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:56.624 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:56.883 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:56.883 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:56.883 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:56.883 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.142 "name": "pt2", 00:17:57.142 "aliases": [ 00:17:57.142 "00000000-0000-0000-0000-000000000002" 00:17:57.142 ], 00:17:57.142 "product_name": "passthru", 00:17:57.142 "block_size": 512, 00:17:57.142 "num_blocks": 65536, 00:17:57.142 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:57.142 "assigned_rate_limits": { 00:17:57.142 "rw_ios_per_sec": 0, 00:17:57.142 "rw_mbytes_per_sec": 0, 00:17:57.142 "r_mbytes_per_sec": 0, 00:17:57.142 "w_mbytes_per_sec": 0 00:17:57.142 }, 00:17:57.142 "claimed": true, 00:17:57.142 "claim_type": "exclusive_write", 00:17:57.142 "zoned": false, 00:17:57.142 "supported_io_types": { 00:17:57.142 "read": true, 00:17:57.142 "write": true, 
00:17:57.142 "unmap": true, 00:17:57.142 "flush": true, 00:17:57.142 "reset": true, 00:17:57.142 "nvme_admin": false, 00:17:57.142 "nvme_io": false, 00:17:57.142 "nvme_io_md": false, 00:17:57.142 "write_zeroes": true, 00:17:57.142 "zcopy": true, 00:17:57.142 "get_zone_info": false, 00:17:57.142 "zone_management": false, 00:17:57.142 "zone_append": false, 00:17:57.142 "compare": false, 00:17:57.142 "compare_and_write": false, 00:17:57.142 "abort": true, 00:17:57.142 "seek_hole": false, 00:17:57.142 "seek_data": false, 00:17:57.142 "copy": true, 00:17:57.142 "nvme_iov_md": false 00:17:57.142 }, 00:17:57.142 "memory_domains": [ 00:17:57.142 { 00:17:57.142 "dma_device_id": "system", 00:17:57.142 "dma_device_type": 1 00:17:57.142 }, 00:17:57.142 { 00:17:57.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.142 "dma_device_type": 2 00:17:57.142 } 00:17:57.142 ], 00:17:57.142 "driver_specific": { 00:17:57.142 "passthru": { 00:17:57.142 "name": "pt2", 00:17:57.142 "base_bdev_name": "malloc2" 00:17:57.142 } 00:17:57.142 } 00:17:57.142 }' 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:57.142 19:53:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.401 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.401 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:57.401 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:57.401 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:57.401 19:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:57.660 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:57.660 "name": "pt3", 00:17:57.660 "aliases": [ 00:17:57.660 "00000000-0000-0000-0000-000000000003" 00:17:57.660 ], 00:17:57.660 "product_name": "passthru", 00:17:57.660 "block_size": 512, 00:17:57.660 "num_blocks": 65536, 00:17:57.660 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:57.660 "assigned_rate_limits": { 00:17:57.660 "rw_ios_per_sec": 0, 00:17:57.660 "rw_mbytes_per_sec": 0, 00:17:57.660 "r_mbytes_per_sec": 0, 00:17:57.660 "w_mbytes_per_sec": 0 00:17:57.660 }, 00:17:57.660 "claimed": true, 00:17:57.661 "claim_type": "exclusive_write", 00:17:57.661 "zoned": false, 00:17:57.661 "supported_io_types": { 00:17:57.661 "read": true, 00:17:57.661 "write": true, 00:17:57.661 "unmap": true, 00:17:57.661 "flush": true, 00:17:57.661 "reset": true, 00:17:57.661 "nvme_admin": false, 00:17:57.661 "nvme_io": false, 00:17:57.661 "nvme_io_md": false, 00:17:57.661 "write_zeroes": true, 00:17:57.661 "zcopy": true, 00:17:57.661 "get_zone_info": false, 00:17:57.661 "zone_management": false, 00:17:57.661 "zone_append": false, 00:17:57.661 "compare": false, 00:17:57.661 "compare_and_write": false, 00:17:57.661 "abort": true, 00:17:57.661 "seek_hole": false, 00:17:57.661 "seek_data": false, 00:17:57.661 "copy": true, 00:17:57.661 "nvme_iov_md": 
false 00:17:57.661 }, 00:17:57.661 "memory_domains": [ 00:17:57.661 { 00:17:57.661 "dma_device_id": "system", 00:17:57.661 "dma_device_type": 1 00:17:57.661 }, 00:17:57.661 { 00:17:57.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.661 "dma_device_type": 2 00:17:57.661 } 00:17:57.661 ], 00:17:57.661 "driver_specific": { 00:17:57.661 "passthru": { 00:17:57.661 "name": "pt3", 00:17:57.661 "base_bdev_name": "malloc3" 00:17:57.661 } 00:17:57.661 } 00:17:57.661 }' 00:17:57.661 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.661 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:57.661 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:57.661 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.661 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:57.661 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:57.661 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.920 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:57.920 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:57.920 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.920 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:57.920 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:57.920 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:57.920 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:17:58.179 [2024-07-24 
19:53:49.620859] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:58.179 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ec4f620e-ef9a-4a82-b71c-8b767b8ba72a '!=' ec4f620e-ef9a-4a82-b71c-8b767b8ba72a ']' 00:17:58.179 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:17:58.179 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:58.179 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:58.179 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:58.437 [2024-07-24 19:53:49.869267] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:58.437 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:58.437 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:58.437 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:58.437 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:58.437 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:58.437 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:58.437 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.438 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.438 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.438 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.438 19:53:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.438 19:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:58.696 19:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.696 "name": "raid_bdev1", 00:17:58.696 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:17:58.696 "strip_size_kb": 0, 00:17:58.696 "state": "online", 00:17:58.696 "raid_level": "raid1", 00:17:58.696 "superblock": true, 00:17:58.696 "num_base_bdevs": 3, 00:17:58.696 "num_base_bdevs_discovered": 2, 00:17:58.696 "num_base_bdevs_operational": 2, 00:17:58.696 "base_bdevs_list": [ 00:17:58.696 { 00:17:58.696 "name": null, 00:17:58.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.696 "is_configured": false, 00:17:58.696 "data_offset": 2048, 00:17:58.696 "data_size": 63488 00:17:58.696 }, 00:17:58.696 { 00:17:58.696 "name": "pt2", 00:17:58.696 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:58.696 "is_configured": true, 00:17:58.696 "data_offset": 2048, 00:17:58.696 "data_size": 63488 00:17:58.696 }, 00:17:58.696 { 00:17:58.696 "name": "pt3", 00:17:58.696 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:58.696 "is_configured": true, 00:17:58.696 "data_offset": 2048, 00:17:58.696 "data_size": 63488 00:17:58.696 } 00:17:58.696 ] 00:17:58.696 }' 00:17:58.696 19:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.696 19:53:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.262 19:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:59.519 [2024-07-24 19:53:50.948104] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:17:59.519 [2024-07-24 19:53:50.948132] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:59.519 [2024-07-24 19:53:50.948183] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:59.519 [2024-07-24 19:53:50.948239] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:59.519 [2024-07-24 19:53:50.948251] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bc6ec0 name raid_bdev1, state offline 00:17:59.519 19:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:17:59.519 19:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.776 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:17:59.776 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:17:59.776 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:17:59.776 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:17:59.776 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:00.034 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:18:00.034 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:18:00.034 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:00.292 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:18:00.292 19:53:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:18:00.292 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:18:00.292 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:18:00.292 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:00.550 [2024-07-24 19:53:51.942678] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:00.550 [2024-07-24 19:53:51.942729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:00.550 [2024-07-24 19:53:51.942748] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbcb50 00:18:00.550 [2024-07-24 19:53:51.942765] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:00.550 [2024-07-24 19:53:51.944379] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:00.550 [2024-07-24 19:53:51.944423] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:00.550 [2024-07-24 19:53:51.944492] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:00.550 [2024-07-24 19:53:51.944521] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:00.550 pt2 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.550 19:53:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.550 19:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.808 19:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.808 "name": "raid_bdev1", 00:18:00.808 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:18:00.808 "strip_size_kb": 0, 00:18:00.808 "state": "configuring", 00:18:00.808 "raid_level": "raid1", 00:18:00.808 "superblock": true, 00:18:00.808 "num_base_bdevs": 3, 00:18:00.808 "num_base_bdevs_discovered": 1, 00:18:00.808 "num_base_bdevs_operational": 2, 00:18:00.808 "base_bdevs_list": [ 00:18:00.808 { 00:18:00.808 "name": null, 00:18:00.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.808 "is_configured": false, 00:18:00.808 "data_offset": 2048, 00:18:00.808 "data_size": 63488 00:18:00.808 }, 00:18:00.808 { 00:18:00.808 "name": "pt2", 00:18:00.808 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:00.808 "is_configured": true, 00:18:00.808 "data_offset": 2048, 00:18:00.808 "data_size": 63488 00:18:00.808 }, 00:18:00.808 { 00:18:00.808 "name": null, 00:18:00.808 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:00.808 
"is_configured": false, 00:18:00.808 "data_offset": 2048, 00:18:00.808 "data_size": 63488 00:18:00.808 } 00:18:00.808 ] 00:18:00.808 }' 00:18:00.808 19:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.808 19:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.373 19:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:18:01.373 19:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:18:01.373 19:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=2 00:18:01.373 19:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:01.631 [2024-07-24 19:53:53.033584] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:01.631 [2024-07-24 19:53:53.033636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:01.631 [2024-07-24 19:53:53.033654] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb9ed0 00:18:01.631 [2024-07-24 19:53:53.033668] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:01.631 [2024-07-24 19:53:53.034008] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:01.631 [2024-07-24 19:53:53.034025] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:01.631 [2024-07-24 19:53:53.034090] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:01.631 [2024-07-24 19:53:53.034109] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:01.631 [2024-07-24 19:53:53.034205] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bba850 00:18:01.631 [2024-07-24 
19:53:53.034215] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:01.631 [2024-07-24 19:53:53.034376] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a0f810 00:18:01.631 [2024-07-24 19:53:53.034514] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bba850 00:18:01.631 [2024-07-24 19:53:53.034525] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bba850 00:18:01.631 [2024-07-24 19:53:53.034622] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:01.631 pt3 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.631 19:53:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:01.890 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.890 "name": "raid_bdev1", 00:18:01.890 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:18:01.890 "strip_size_kb": 0, 00:18:01.890 "state": "online", 00:18:01.890 "raid_level": "raid1", 00:18:01.890 "superblock": true, 00:18:01.890 "num_base_bdevs": 3, 00:18:01.890 "num_base_bdevs_discovered": 2, 00:18:01.890 "num_base_bdevs_operational": 2, 00:18:01.890 "base_bdevs_list": [ 00:18:01.890 { 00:18:01.890 "name": null, 00:18:01.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.890 "is_configured": false, 00:18:01.890 "data_offset": 2048, 00:18:01.890 "data_size": 63488 00:18:01.890 }, 00:18:01.890 { 00:18:01.890 "name": "pt2", 00:18:01.890 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:01.890 "is_configured": true, 00:18:01.890 "data_offset": 2048, 00:18:01.890 "data_size": 63488 00:18:01.890 }, 00:18:01.890 { 00:18:01.890 "name": "pt3", 00:18:01.890 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:01.890 "is_configured": true, 00:18:01.890 "data_offset": 2048, 00:18:01.890 "data_size": 63488 00:18:01.890 } 00:18:01.890 ] 00:18:01.890 }' 00:18:01.890 19:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.890 19:53:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.822 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:02.822 [2024-07-24 19:53:54.413246] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:02.822 [2024-07-24 19:53:54.413275] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:02.822 [2024-07-24 19:53:54.413327] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:18:02.822 [2024-07-24 19:53:54.413383] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:02.822 [2024-07-24 19:53:54.413406] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bba850 name raid_bdev1, state offline 00:18:03.079 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.079 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:18:03.337 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:18:03.337 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:18:03.337 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 3 -gt 2 ']' 00:18:03.337 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=2 00:18:03.337 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:03.596 19:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:03.596 [2024-07-24 19:53:55.167214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:03.596 [2024-07-24 19:53:55.167266] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:03.596 [2024-07-24 19:53:55.167284] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0c150 00:18:03.596 [2024-07-24 19:53:55.167297] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:03.596 [2024-07-24 19:53:55.168895] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:03.596 [2024-07-24 19:53:55.168924] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:03.596 [2024-07-24 19:53:55.168987] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:03.596 [2024-07-24 19:53:55.169014] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:03.596 [2024-07-24 19:53:55.169113] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:03.596 [2024-07-24 19:53:55.169126] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:03.596 [2024-07-24 19:53:55.169140] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0d0e0 name raid_bdev1, state configuring 00:18:03.596 [2024-07-24 19:53:55.169164] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:03.596 pt1 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 3 -gt 2 ']' 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:03.854 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.855 "name": "raid_bdev1", 00:18:03.855 "uuid": "ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:18:03.855 "strip_size_kb": 0, 00:18:03.855 "state": "configuring", 00:18:03.855 "raid_level": "raid1", 00:18:03.855 "superblock": true, 00:18:03.855 "num_base_bdevs": 3, 00:18:03.855 "num_base_bdevs_discovered": 1, 00:18:03.855 "num_base_bdevs_operational": 2, 00:18:03.855 "base_bdevs_list": [ 00:18:03.855 { 00:18:03.855 "name": null, 00:18:03.855 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.855 "is_configured": false, 00:18:03.855 "data_offset": 2048, 00:18:03.855 "data_size": 63488 00:18:03.855 }, 00:18:03.855 { 00:18:03.855 "name": "pt2", 00:18:03.855 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:03.855 "is_configured": true, 00:18:03.855 "data_offset": 2048, 00:18:03.855 "data_size": 63488 00:18:03.855 }, 00:18:03.855 { 00:18:03.855 "name": null, 00:18:03.855 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:03.855 "is_configured": false, 00:18:03.855 "data_offset": 2048, 00:18:03.855 "data_size": 63488 00:18:03.855 } 00:18:03.855 ] 00:18:03.855 }' 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.855 19:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.787 19:53:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:04.787 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:05.354 [2024-07-24 19:53:56.875748] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:05.354 [2024-07-24 19:53:56.875797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:05.354 [2024-07-24 19:53:56.875819] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0c5c0 00:18:05.354 [2024-07-24 19:53:56.875831] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:05.354 [2024-07-24 19:53:56.876173] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:05.354 [2024-07-24 19:53:56.876190] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:05.354 [2024-07-24 19:53:56.876256] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:05.354 [2024-07-24 19:53:56.876277] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:05.354 [2024-07-24 19:53:56.876378] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bb9fa0 00:18:05.354 [2024-07-24 19:53:56.876398] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:05.354 [2024-07-24 19:53:56.876564] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a0f850 00:18:05.354 [2024-07-24 
19:53:56.876700] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bb9fa0 00:18:05.354 [2024-07-24 19:53:56.876710] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bb9fa0 00:18:05.354 [2024-07-24 19:53:56.876807] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:05.354 pt3 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.354 19:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:05.613 19:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.613 "name": "raid_bdev1", 00:18:05.613 "uuid": 
"ec4f620e-ef9a-4a82-b71c-8b767b8ba72a", 00:18:05.613 "strip_size_kb": 0, 00:18:05.613 "state": "online", 00:18:05.613 "raid_level": "raid1", 00:18:05.613 "superblock": true, 00:18:05.613 "num_base_bdevs": 3, 00:18:05.613 "num_base_bdevs_discovered": 2, 00:18:05.613 "num_base_bdevs_operational": 2, 00:18:05.613 "base_bdevs_list": [ 00:18:05.613 { 00:18:05.613 "name": null, 00:18:05.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.613 "is_configured": false, 00:18:05.613 "data_offset": 2048, 00:18:05.613 "data_size": 63488 00:18:05.613 }, 00:18:05.613 { 00:18:05.613 "name": "pt2", 00:18:05.613 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:05.613 "is_configured": true, 00:18:05.613 "data_offset": 2048, 00:18:05.613 "data_size": 63488 00:18:05.613 }, 00:18:05.613 { 00:18:05.613 "name": "pt3", 00:18:05.613 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:05.613 "is_configured": true, 00:18:05.613 "data_offset": 2048, 00:18:05.613 "data_size": 63488 00:18:05.613 } 00:18:05.613 ] 00:18:05.613 }' 00:18:05.613 19:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.613 19:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.554 19:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:06.554 19:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:06.829 19:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:18:06.829 19:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:06.829 19:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:18:07.397 [2024-07-24 
19:53:58.740926] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' ec4f620e-ef9a-4a82-b71c-8b767b8ba72a '!=' ec4f620e-ef9a-4a82-b71c-8b767b8ba72a ']' 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1430095 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1430095 ']' 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1430095 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1430095 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1430095' 00:18:07.397 killing process with pid 1430095 00:18:07.397 19:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1430095 00:18:07.397 [2024-07-24 19:53:58.824176] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:07.397 [2024-07-24 19:53:58.824236] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:07.397 [2024-07-24 19:53:58.824294] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:07.397 [2024-07-24 19:53:58.824309] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb9fa0 name raid_bdev1, state offline 00:18:07.397 19:53:58 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1430095 00:18:07.397 [2024-07-24 19:53:58.854011] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:07.656 19:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:18:07.656 00:18:07.656 real 0m25.071s 00:18:07.656 user 0m46.057s 00:18:07.656 sys 0m4.305s 00:18:07.656 19:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:07.656 19:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.656 ************************************ 00:18:07.656 END TEST raid_superblock_test 00:18:07.656 ************************************ 00:18:07.656 19:53:59 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:18:07.656 19:53:59 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:07.656 19:53:59 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:07.656 19:53:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:07.656 ************************************ 00:18:07.656 START TEST raid_read_error_test 00:18:07.656 ************************************ 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 read 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:07.656 
19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:07.656 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.9HtKziUkVM 00:18:07.657 19:53:59 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1433838 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1433838 /var/tmp/spdk-raid.sock 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1433838 ']' 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:07.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:07.657 19:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.657 [2024-07-24 19:53:59.239195] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:18:07.657 [2024-07-24 19:53:59.239251] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1433838 ] 00:18:07.915 [2024-07-24 19:53:59.353559] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:07.915 [2024-07-24 19:53:59.454548] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:08.173 [2024-07-24 19:53:59.518064] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.173 [2024-07-24 19:53:59.518099] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.740 19:54:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:08.740 19:54:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:08.740 19:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:08.740 19:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:08.740 BaseBdev1_malloc 00:18:08.740 19:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:08.997 true 00:18:08.997 19:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:09.255 [2024-07-24 19:54:00.651284] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:09.255 [2024-07-24 19:54:00.651332] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:18:09.255 [2024-07-24 19:54:00.651354] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b23a0 00:18:09.255 [2024-07-24 19:54:00.651367] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.255 [2024-07-24 19:54:00.653074] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.255 [2024-07-24 19:54:00.653104] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:09.255 BaseBdev1 00:18:09.255 19:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:09.255 19:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:09.255 BaseBdev2_malloc 00:18:09.513 19:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:09.513 true 00:18:09.513 19:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:09.771 [2024-07-24 19:54:01.197387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:09.771 [2024-07-24 19:54:01.197442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:09.771 [2024-07-24 19:54:01.197466] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2671370 00:18:09.771 [2024-07-24 19:54:01.197479] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.771 [2024-07-24 19:54:01.199039] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.771 [2024-07-24 19:54:01.199069] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:09.771 BaseBdev2 00:18:09.771 19:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:09.771 19:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:10.030 BaseBdev3_malloc 00:18:10.030 19:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:10.289 true 00:18:10.289 19:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:10.289 [2024-07-24 19:54:01.827783] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:10.289 [2024-07-24 19:54:01.827830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:10.289 [2024-07-24 19:54:01.827856] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25a72d0 00:18:10.289 [2024-07-24 19:54:01.827868] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:10.289 [2024-07-24 19:54:01.829362] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:10.289 [2024-07-24 19:54:01.829404] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:10.289 BaseBdev3 00:18:10.289 19:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:10.548 [2024-07-24 19:54:02.016309] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:10.548 [2024-07-24 19:54:02.017563] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:10.548 [2024-07-24 19:54:02.017631] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:10.548 [2024-07-24 19:54:02.017849] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25a8860 00:18:10.548 [2024-07-24 19:54:02.017861] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:10.548 [2024-07-24 19:54:02.018054] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25aa6a0 00:18:10.548 [2024-07-24 19:54:02.018207] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25a8860 00:18:10.548 [2024-07-24 19:54:02.018217] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25a8860 00:18:10.548 [2024-07-24 19:54:02.018323] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.548 
19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.548 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:10.807 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.807 "name": "raid_bdev1", 00:18:10.807 "uuid": "9e5d9789-4451-476f-a363-70d43e03de68", 00:18:10.807 "strip_size_kb": 0, 00:18:10.807 "state": "online", 00:18:10.807 "raid_level": "raid1", 00:18:10.807 "superblock": true, 00:18:10.807 "num_base_bdevs": 3, 00:18:10.807 "num_base_bdevs_discovered": 3, 00:18:10.807 "num_base_bdevs_operational": 3, 00:18:10.807 "base_bdevs_list": [ 00:18:10.807 { 00:18:10.807 "name": "BaseBdev1", 00:18:10.807 "uuid": "a2609e7a-558c-5958-a8ad-d1878c36cd62", 00:18:10.807 "is_configured": true, 00:18:10.807 "data_offset": 2048, 00:18:10.807 "data_size": 63488 00:18:10.807 }, 00:18:10.807 { 00:18:10.807 "name": "BaseBdev2", 00:18:10.807 "uuid": "a7a32d1a-5bdd-5752-8347-2e5cde45daa5", 00:18:10.807 "is_configured": true, 00:18:10.807 "data_offset": 2048, 00:18:10.807 "data_size": 63488 00:18:10.807 }, 00:18:10.807 { 00:18:10.807 "name": "BaseBdev3", 00:18:10.807 "uuid": "24045a09-3a54-523c-adb3-2a47d5cfa6dc", 00:18:10.807 "is_configured": true, 00:18:10.807 "data_offset": 2048, 00:18:10.807 "data_size": 63488 00:18:10.807 } 00:18:10.807 ] 00:18:10.807 }' 00:18:10.807 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.807 19:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.375 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- 
# sleep 1 00:18:11.375 19:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:11.375 [2024-07-24 19:54:02.955142] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ae3e0 00:18:12.313 19:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.572 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:12.831 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.831 "name": "raid_bdev1", 00:18:12.831 "uuid": "9e5d9789-4451-476f-a363-70d43e03de68", 00:18:12.831 "strip_size_kb": 0, 00:18:12.831 "state": "online", 00:18:12.831 "raid_level": "raid1", 00:18:12.831 "superblock": true, 00:18:12.831 "num_base_bdevs": 3, 00:18:12.831 "num_base_bdevs_discovered": 3, 00:18:12.831 "num_base_bdevs_operational": 3, 00:18:12.831 "base_bdevs_list": [ 00:18:12.831 { 00:18:12.831 "name": "BaseBdev1", 00:18:12.831 "uuid": "a2609e7a-558c-5958-a8ad-d1878c36cd62", 00:18:12.831 "is_configured": true, 00:18:12.831 "data_offset": 2048, 00:18:12.831 "data_size": 63488 00:18:12.831 }, 00:18:12.831 { 00:18:12.831 "name": "BaseBdev2", 00:18:12.831 "uuid": "a7a32d1a-5bdd-5752-8347-2e5cde45daa5", 00:18:12.831 "is_configured": true, 00:18:12.831 "data_offset": 2048, 00:18:12.831 "data_size": 63488 00:18:12.831 }, 00:18:12.831 { 00:18:12.831 "name": "BaseBdev3", 00:18:12.831 "uuid": "24045a09-3a54-523c-adb3-2a47d5cfa6dc", 00:18:12.831 "is_configured": true, 00:18:12.831 "data_offset": 2048, 00:18:12.831 "data_size": 63488 00:18:12.831 } 00:18:12.831 ] 00:18:12.831 }' 00:18:12.831 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.831 19:54:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.398 19:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:13.657 [2024-07-24 19:54:05.114493] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:13.657 [2024-07-24 19:54:05.114531] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:13.657 [2024-07-24 19:54:05.117679] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:13.657 [2024-07-24 19:54:05.117717] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:13.657 [2024-07-24 19:54:05.117813] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:13.657 [2024-07-24 19:54:05.117826] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25a8860 name raid_bdev1, state offline 00:18:13.657 0 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1433838 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1433838 ']' 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1433838 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1433838 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1433838' 00:18:13.657 killing process with pid 1433838 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # 
kill 1433838 00:18:13.657 [2024-07-24 19:54:05.198263] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:13.657 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1433838 00:18:13.657 [2024-07-24 19:54:05.218727] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.9HtKziUkVM 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:13.916 00:18:13.916 real 0m6.272s 00:18:13.916 user 0m9.784s 00:18:13.916 sys 0m1.149s 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:13.916 19:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.916 ************************************ 00:18:13.916 END TEST raid_read_error_test 00:18:13.916 ************************************ 00:18:13.916 19:54:05 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:18:13.916 19:54:05 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:13.916 19:54:05 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:13.916 19:54:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:14.176 
************************************ 00:18:14.176 START TEST raid_write_error_test 00:18:14.176 ************************************ 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 write 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:14.176 19:54:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.BDJxIZI1j4 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1435173 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1435173 /var/tmp/spdk-raid.sock 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1435173 ']' 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:18:14.177 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:14.177 19:54:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.177 [2024-07-24 19:54:05.612004] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:18:14.177 [2024-07-24 19:54:05.612075] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1435173 ] 00:18:14.177 [2024-07-24 19:54:05.741363] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:14.436 [2024-07-24 19:54:05.844368] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:14.436 [2024-07-24 19:54:05.907494] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:14.436 [2024-07-24 19:54:05.907534] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:15.372 19:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:15.372 19:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:15.372 19:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:15.372 19:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:15.631 BaseBdev1_malloc 00:18:15.631 19:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:16.197 true 00:18:16.197 19:54:07 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:16.456 [2024-07-24 19:54:07.803738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:16.456 [2024-07-24 19:54:07.803786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:16.456 [2024-07-24 19:54:07.803806] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf283a0 00:18:16.456 [2024-07-24 19:54:07.803819] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:16.456 [2024-07-24 19:54:07.805613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:16.456 [2024-07-24 19:54:07.805641] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:16.456 BaseBdev1 00:18:16.456 19:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:16.456 19:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:17.023 BaseBdev2_malloc 00:18:17.023 19:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:17.023 true 00:18:17.023 19:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:17.588 [2024-07-24 19:54:09.068888] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:17.588 [2024-07-24 19:54:09.068934] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:18:17.588 [2024-07-24 19:54:09.068959] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe7370
00:18:17.588 [2024-07-24 19:54:09.068978] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:18:17.588 [2024-07-24 19:54:09.070609] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:18:17.588 [2024-07-24 19:54:09.070638] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:18:17.588 BaseBdev2
00:18:17.588 19:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}"
00:18:17.588 19:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:18:17.846 BaseBdev3_malloc
00:18:17.846 19:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:18:18.414 true
00:18:18.414 19:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:18:18.983 [2024-07-24 19:54:10.350048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:18:18.983 [2024-07-24 19:54:10.350093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:18:18.984 [2024-07-24 19:54:10.350117] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1d2d0
00:18:18.984 [2024-07-24 19:54:10.350130] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:18:18.984 [2024-07-24 19:54:10.351723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:18:18.984 [2024-07-24 19:54:10.351751] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:18:18.984 BaseBdev3
00:18:18.984 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s
00:18:19.242 [2024-07-24 19:54:10.606760] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:18:19.242 [2024-07-24 19:54:10.608087] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:18:19.242 [2024-07-24 19:54:10.608157] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:18:19.242 [2024-07-24 19:54:10.608382] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf1e860
00:18:19.242 [2024-07-24 19:54:10.608402] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:18:19.242 [2024-07-24 19:54:10.608599] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf206a0
00:18:19.242 [2024-07-24 19:54:10.608757] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf1e860
00:18:19.242 [2024-07-24 19:54:10.608767] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf1e860
00:18:19.242 [2024-07-24 19:54:10.608878] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:19.242 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:19.500 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:19.500 "name": "raid_bdev1",
00:18:19.500 "uuid": "1549adbb-fbcb-43d9-8160-3d8b18c76cce",
00:18:19.500 "strip_size_kb": 0,
00:18:19.500 "state": "online",
00:18:19.500 "raid_level": "raid1",
00:18:19.500 "superblock": true,
00:18:19.500 "num_base_bdevs": 3,
00:18:19.500 "num_base_bdevs_discovered": 3,
00:18:19.500 "num_base_bdevs_operational": 3,
00:18:19.500 "base_bdevs_list": [
00:18:19.500 {
00:18:19.500 "name": "BaseBdev1",
00:18:19.500 "uuid": "3bfc72c0-4126-5008-9529-dcf896d64053",
00:18:19.500 "is_configured": true,
00:18:19.500 "data_offset": 2048,
00:18:19.501 "data_size": 63488
00:18:19.501 },
00:18:19.501 {
00:18:19.501 "name": "BaseBdev2",
00:18:19.501 "uuid": "8359af7d-4acc-571b-9b38-8e0f605e4bda",
00:18:19.501 "is_configured": true,
00:18:19.501 "data_offset": 2048,
00:18:19.501 "data_size": 63488
00:18:19.501 },
00:18:19.501 {
00:18:19.501 "name": "BaseBdev3",
00:18:19.501 "uuid": "d41f747e-356d-5cec-b211-ef1f2dca0b6b",
00:18:19.501 "is_configured": true,
00:18:19.501 "data_offset": 2048,
00:18:19.501 "data_size": 63488
00:18:19.501 }
00:18:19.501 ]
00:18:19.501 }'
00:18:19.501 19:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:19.501 19:54:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:18:20.068 19:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1
00:18:20.068 19:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:18:20.068 [2024-07-24 19:54:11.557595] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf243e0
00:18:21.005 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:18:21.264 [2024-07-24 19:54:12.758668] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1'
00:18:21.264 [2024-07-24 19:54:12.758727] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:18:21.264 [2024-07-24 19:54:12.758926] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf243e0
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]]
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]]
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=2
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:21.264 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:21.523 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:21.523 "name": "raid_bdev1",
00:18:21.523 "uuid": "1549adbb-fbcb-43d9-8160-3d8b18c76cce",
00:18:21.523 "strip_size_kb": 0,
00:18:21.523 "state": "online",
00:18:21.523 "raid_level": "raid1",
00:18:21.523 "superblock": true,
00:18:21.523 "num_base_bdevs": 3,
00:18:21.523 "num_base_bdevs_discovered": 2,
00:18:21.523 "num_base_bdevs_operational": 2,
00:18:21.523 "base_bdevs_list": [
00:18:21.523 {
00:18:21.523 "name": null,
00:18:21.523 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:21.523 "is_configured": false,
00:18:21.523 "data_offset": 2048,
00:18:21.523 "data_size": 63488
00:18:21.523 },
00:18:21.523 {
00:18:21.523 "name": "BaseBdev2",
00:18:21.523 "uuid": "8359af7d-4acc-571b-9b38-8e0f605e4bda",
00:18:21.523 "is_configured": true,
00:18:21.523 "data_offset": 2048,
00:18:21.523 "data_size": 63488
00:18:21.523 },
00:18:21.523 {
00:18:21.523 "name": "BaseBdev3",
00:18:21.523 "uuid": "d41f747e-356d-5cec-b211-ef1f2dca0b6b",
00:18:21.523 "is_configured": true,
00:18:21.523 "data_offset": 2048,
00:18:21.523 "data_size": 63488
00:18:21.523 }
00:18:21.523 ]
00:18:21.523 }'
00:18:21.523 19:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:21.523 19:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:18:22.092 19:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:18:22.351 [2024-07-24 19:54:13.809465] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:18:22.351 [2024-07-24 19:54:13.809501] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:18:22.351 [2024-07-24 19:54:13.812667] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:18:22.351 [2024-07-24 19:54:13.812700] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:18:22.351 [2024-07-24 19:54:13.812771] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:18:22.351 [2024-07-24 19:54:13.812783] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf1e860 name raid_bdev1, state offline
00:18:22.351 0
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1435173
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1435173 ']'
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1435173
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1435173
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1435173'
00:18:22.351 killing process with pid 1435173
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1435173
00:18:22.351 [2024-07-24 19:54:13.880923] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:18:22.351 19:54:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1435173
00:18:22.351 [2024-07-24 19:54:13.904705] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.BDJxIZI1j4
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}'
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]]
00:18:22.609
00:18:22.609 real 0m8.617s
00:18:22.609 user 0m14.066s
00:18:22.609 sys 0m1.439s
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable
00:18:22.609 19:54:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:18:22.609 ************************************
00:18:22.609 END TEST raid_write_error_test
00:18:22.609 ************************************
00:18:22.609 19:54:14 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4}
00:18:22.609 19:54:14 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1
00:18:22.609 19:54:14 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false
00:18:22.609 19:54:14 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:18:22.609 19:54:14 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable
00:18:22.609 19:54:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:18:22.869 ************************************
00:18:22.869 START TEST raid_state_function_test
00:18:22.869 ************************************
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 false
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']'
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1436476
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1436476'
00:18:22.869 Process raid pid: 1436476
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1436476 /var/tmp/spdk-raid.sock
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1436476 ']'
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:18:22.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable
00:18:22.869 19:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:22.869 [2024-07-24 19:54:14.307239] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:18:22.869 [2024-07-24 19:54:14.307313] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:18:22.869 [2024-07-24 19:54:14.438910] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:18:23.128 [2024-07-24 19:54:14.541939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:18:23.128 [2024-07-24 19:54:14.609661] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:18:23.128 [2024-07-24 19:54:14.609686] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:18:23.728 19:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:18:23.728 19:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0
00:18:23.728 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:18:23.997 [2024-07-24 19:54:15.441560] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:18:23.997 [2024-07-24 19:54:15.441602] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:18:23.997 [2024-07-24 19:54:15.441613] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:18:23.997 [2024-07-24 19:54:15.441625] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:18:23.997 [2024-07-24 19:54:15.441633] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:18:23.997 [2024-07-24 19:54:15.441645] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:18:23.997 [2024-07-24 19:54:15.441653] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:18:23.997 [2024-07-24 19:54:15.441664] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:18:23.997 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:23.998 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:24.257 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:24.257 "name": "Existed_Raid",
00:18:24.257 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:24.257 "strip_size_kb": 64,
00:18:24.257 "state": "configuring",
00:18:24.257 "raid_level": "raid0",
00:18:24.257 "superblock": false,
00:18:24.257 "num_base_bdevs": 4,
00:18:24.257 "num_base_bdevs_discovered": 0,
00:18:24.257 "num_base_bdevs_operational": 4,
00:18:24.257 "base_bdevs_list": [
00:18:24.257 {
00:18:24.257 "name": "BaseBdev1",
00:18:24.257 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:24.257 "is_configured": false,
00:18:24.257 "data_offset": 0,
00:18:24.257 "data_size": 0
00:18:24.257 },
00:18:24.257 {
00:18:24.257 "name": "BaseBdev2",
00:18:24.257 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:24.257 "is_configured": false,
00:18:24.257 "data_offset": 0,
00:18:24.257 "data_size": 0
00:18:24.257 },
00:18:24.257 {
00:18:24.257 "name": "BaseBdev3",
00:18:24.257 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:24.257 "is_configured": false,
00:18:24.257 "data_offset": 0,
00:18:24.257 "data_size": 0
00:18:24.257 },
00:18:24.257 {
00:18:24.257 "name": "BaseBdev4",
00:18:24.257 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:24.257 "is_configured": false,
00:18:24.257 "data_offset": 0,
00:18:24.257 "data_size": 0
00:18:24.257 }
00:18:24.257 ]
00:18:24.257 }'
00:18:24.257 19:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:24.257 19:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:24.825 19:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:18:25.084 [2024-07-24 19:54:16.524334] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:18:25.084 [2024-07-24 19:54:16.524363] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b7ca30 name Existed_Raid, state configuring
00:18:25.084 19:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:18:25.343 [2024-07-24 19:54:16.773085] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:18:25.343 [2024-07-24 19:54:16.773118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:18:25.343 [2024-07-24 19:54:16.773128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:18:25.343 [2024-07-24 19:54:16.773140] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:18:25.343 [2024-07-24 19:54:16.773148] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:18:25.343 [2024-07-24 19:54:16.773159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:18:25.343 [2024-07-24 19:54:16.773168] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:18:25.343 [2024-07-24 19:54:16.773179] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:18:25.343 19:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:18:25.603 [2024-07-24 19:54:17.031683] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:18:25.603 BaseBdev1
00:18:25.603 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:18:25.603 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1
00:18:25.603 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:18:25.603 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:18:25.603 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:18:25.603 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:18:25.603 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:25.862 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:18:26.122 [
00:18:26.122 {
00:18:26.122 "name": "BaseBdev1",
00:18:26.122 "aliases": [
00:18:26.122 "381997e7-acdc-48da-baac-1c9a8c411515"
00:18:26.122 ],
00:18:26.122 "product_name": "Malloc disk",
00:18:26.122 "block_size": 512,
00:18:26.122 "num_blocks": 65536,
00:18:26.122 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515",
00:18:26.122 "assigned_rate_limits": {
00:18:26.122 "rw_ios_per_sec": 0,
00:18:26.122 "rw_mbytes_per_sec": 0,
00:18:26.122 "r_mbytes_per_sec": 0,
00:18:26.122 "w_mbytes_per_sec": 0
00:18:26.122 },
00:18:26.122 "claimed": true,
00:18:26.122 "claim_type": "exclusive_write",
00:18:26.122 "zoned": false,
00:18:26.122 "supported_io_types": {
00:18:26.122 "read": true,
00:18:26.122 "write": true,
00:18:26.122 "unmap": true,
00:18:26.122 "flush": true,
00:18:26.122 "reset": true,
00:18:26.122 "nvme_admin": false,
00:18:26.122 "nvme_io": false,
00:18:26.122 "nvme_io_md": false,
00:18:26.122 "write_zeroes": true,
00:18:26.122 "zcopy": true,
00:18:26.122 "get_zone_info": false,
00:18:26.122 "zone_management": false,
00:18:26.122 "zone_append": false,
00:18:26.122 "compare": false,
00:18:26.122 "compare_and_write": false,
00:18:26.122 "abort": true,
00:18:26.122 "seek_hole": false,
00:18:26.122 "seek_data": false,
00:18:26.122 "copy": true,
00:18:26.122 "nvme_iov_md": false
00:18:26.122 },
00:18:26.122 "memory_domains": [
00:18:26.122 {
00:18:26.122 "dma_device_id": "system",
00:18:26.122 "dma_device_type": 1
00:18:26.122 },
00:18:26.122 {
00:18:26.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:26.122 "dma_device_type": 2
00:18:26.122 }
00:18:26.122 ],
00:18:26.122 "driver_specific": {}
00:18:26.122 }
00:18:26.122 ]
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:26.122 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:26.381 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:26.381 "name": "Existed_Raid",
00:18:26.381 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:26.381 "strip_size_kb": 64,
00:18:26.381 "state": "configuring",
00:18:26.381 "raid_level": "raid0",
00:18:26.381 "superblock": false,
00:18:26.381 "num_base_bdevs": 4,
00:18:26.381 "num_base_bdevs_discovered": 1,
00:18:26.381 "num_base_bdevs_operational": 4,
00:18:26.381 "base_bdevs_list": [
00:18:26.381 {
00:18:26.381 "name": "BaseBdev1",
00:18:26.381 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515",
00:18:26.381 "is_configured": true,
00:18:26.381 "data_offset": 0,
00:18:26.381 "data_size": 65536
00:18:26.381 },
00:18:26.381 {
00:18:26.381 "name": "BaseBdev2",
00:18:26.381 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:26.381 "is_configured": false,
00:18:26.381 "data_offset": 0,
00:18:26.381 "data_size": 0
00:18:26.381 },
00:18:26.381 {
00:18:26.381 "name": "BaseBdev3",
00:18:26.381 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:26.381 "is_configured": false,
00:18:26.381 "data_offset": 0,
00:18:26.381 "data_size": 0
00:18:26.381 },
00:18:26.381 {
00:18:26.381 "name": "BaseBdev4",
00:18:26.381 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:26.381 "is_configured": false,
00:18:26.382 "data_offset": 0,
00:18:26.382 "data_size": 0
00:18:26.382 }
00:18:26.382 ]
00:18:26.382 }'
00:18:26.382 19:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:26.382 19:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:26.949 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:18:27.208 [2024-07-24 19:54:18.627914] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:18:27.208 [2024-07-24 19:54:18.627955] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b7c2a0 name Existed_Raid, state configuring
00:18:27.208 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:18:27.467 [2024-07-24 19:54:18.876608] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:18:27.467 [2024-07-24 19:54:18.878025] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:18:27.467 [2024-07-24 19:54:18.878058] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:18:27.467 [2024-07-24 19:54:18.878069] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:18:27.467 [2024-07-24 19:54:18.878080] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:18:27.467 [2024-07-24 19:54:18.878089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:18:27.467 [2024-07-24 19:54:18.878100] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:27.467 19:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:27.727 19:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:27.727 "name": "Existed_Raid",
00:18:27.727 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:27.727 "strip_size_kb": 64,
00:18:27.727 "state": "configuring",
00:18:27.727 "raid_level": "raid0",
00:18:27.727 "superblock": false,
00:18:27.727 "num_base_bdevs": 4,
00:18:27.727 "num_base_bdevs_discovered": 1,
00:18:27.727 "num_base_bdevs_operational": 4,
00:18:27.727 "base_bdevs_list": [
00:18:27.727 {
00:18:27.727 "name": "BaseBdev1",
00:18:27.727 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515",
00:18:27.727 "is_configured": true,
00:18:27.727 "data_offset": 0,
00:18:27.727 "data_size": 65536
00:18:27.727 },
00:18:27.727 {
00:18:27.727 "name": "BaseBdev2",
00:18:27.727 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:27.727 "is_configured": false,
00:18:27.727 "data_offset": 0,
00:18:27.727 "data_size": 0
00:18:27.727 },
00:18:27.727 {
00:18:27.727 "name": "BaseBdev3",
00:18:27.727 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:27.727 "is_configured": false,
00:18:27.727 "data_offset": 0,
00:18:27.727 "data_size": 0
00:18:27.727 },
00:18:27.727 {
00:18:27.727 "name": "BaseBdev4",
00:18:27.727 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:27.727 "is_configured": false,
00:18:27.727 "data_offset": 0,
00:18:27.727 "data_size": 0
00:18:27.727 }
00:18:27.727 ]
00:18:27.727 }'
00:18:27.727 19:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:27.727 19:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:18:28.295 19:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:18:28.554 [2024-07-24 19:54:19.974870] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:18:28.554 BaseBdev2
00:18:28.554 19:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:18:28.554 19:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2
00:18:28.554 19:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:18:28.554 19:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:18:28.554 19:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:18:28.554 19:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:18:28.554 19:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:28.813 19:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:18:29.072 [
00:18:29.072 {
00:18:29.072 "name": "BaseBdev2",
00:18:29.072 "aliases": [
00:18:29.072 "3aa57b10-8a19-42fc-8fc2-aadf3f986659"
00:18:29.072 ],
00:18:29.072 "product_name": "Malloc disk",
00:18:29.072 "block_size": 512,
00:18:29.072 "num_blocks": 65536,
00:18:29.072 "uuid": "3aa57b10-8a19-42fc-8fc2-aadf3f986659",
00:18:29.072 "assigned_rate_limits": {
00:18:29.072 "rw_ios_per_sec": 0,
00:18:29.072 "rw_mbytes_per_sec": 0,
00:18:29.072 "r_mbytes_per_sec": 0,
00:18:29.072 "w_mbytes_per_sec": 0
00:18:29.072 },
00:18:29.072 "claimed": true,
00:18:29.072 "claim_type": "exclusive_write",
00:18:29.072 "zoned": false,
00:18:29.072 "supported_io_types": {
00:18:29.072 "read": true,
00:18:29.072 "write": true,
00:18:29.072 "unmap": true,
00:18:29.072 "flush": true,
00:18:29.072 "reset": true,
00:18:29.072 "nvme_admin": false,
00:18:29.072 "nvme_io": false,
00:18:29.072 "nvme_io_md": false,
00:18:29.072 "write_zeroes": true,
00:18:29.072 "zcopy": true,
00:18:29.072 "get_zone_info": false,
00:18:29.072 "zone_management": false,
00:18:29.072 "zone_append": false,
00:18:29.072 "compare": false,
00:18:29.072 "compare_and_write": false,
00:18:29.072 "abort": true,
00:18:29.072 "seek_hole": false,
00:18:29.072 "seek_data": false,
00:18:29.072 "copy": true,
00:18:29.072 "nvme_iov_md": false
00:18:29.072 },
00:18:29.072 "memory_domains": [
00:18:29.072 {
00:18:29.072 "dma_device_id": "system",
00:18:29.072 "dma_device_type": 1
00:18:29.072 },
00:18:29.072 {
00:18:29.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:29.072 "dma_device_type": 2
00:18:29.072 }
00:18:29.072
], 00:18:29.072 "driver_specific": {} 00:18:29.072 } 00:18:29.072 ] 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.072 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.331 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.331 "name": 
"Existed_Raid", 00:18:29.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.331 "strip_size_kb": 64, 00:18:29.331 "state": "configuring", 00:18:29.331 "raid_level": "raid0", 00:18:29.331 "superblock": false, 00:18:29.331 "num_base_bdevs": 4, 00:18:29.331 "num_base_bdevs_discovered": 2, 00:18:29.331 "num_base_bdevs_operational": 4, 00:18:29.331 "base_bdevs_list": [ 00:18:29.331 { 00:18:29.331 "name": "BaseBdev1", 00:18:29.331 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515", 00:18:29.331 "is_configured": true, 00:18:29.331 "data_offset": 0, 00:18:29.331 "data_size": 65536 00:18:29.331 }, 00:18:29.331 { 00:18:29.331 "name": "BaseBdev2", 00:18:29.331 "uuid": "3aa57b10-8a19-42fc-8fc2-aadf3f986659", 00:18:29.331 "is_configured": true, 00:18:29.331 "data_offset": 0, 00:18:29.331 "data_size": 65536 00:18:29.331 }, 00:18:29.331 { 00:18:29.331 "name": "BaseBdev3", 00:18:29.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.331 "is_configured": false, 00:18:29.331 "data_offset": 0, 00:18:29.331 "data_size": 0 00:18:29.331 }, 00:18:29.331 { 00:18:29.331 "name": "BaseBdev4", 00:18:29.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.331 "is_configured": false, 00:18:29.331 "data_offset": 0, 00:18:29.331 "data_size": 0 00:18:29.331 } 00:18:29.331 ] 00:18:29.331 }' 00:18:29.332 19:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.332 19:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.899 19:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:30.158 [2024-07-24 19:54:21.586533] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:30.158 BaseBdev3 00:18:30.158 19:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:30.158 19:54:21 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:18:30.158 19:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:30.158 19:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:30.158 19:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:30.158 19:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:30.158 19:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:30.416 19:54:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:30.674 [ 00:18:30.674 { 00:18:30.674 "name": "BaseBdev3", 00:18:30.674 "aliases": [ 00:18:30.674 "a06420ff-3012-4cea-8972-04214f8d930f" 00:18:30.674 ], 00:18:30.674 "product_name": "Malloc disk", 00:18:30.674 "block_size": 512, 00:18:30.674 "num_blocks": 65536, 00:18:30.674 "uuid": "a06420ff-3012-4cea-8972-04214f8d930f", 00:18:30.674 "assigned_rate_limits": { 00:18:30.674 "rw_ios_per_sec": 0, 00:18:30.674 "rw_mbytes_per_sec": 0, 00:18:30.674 "r_mbytes_per_sec": 0, 00:18:30.674 "w_mbytes_per_sec": 0 00:18:30.674 }, 00:18:30.674 "claimed": true, 00:18:30.674 "claim_type": "exclusive_write", 00:18:30.674 "zoned": false, 00:18:30.674 "supported_io_types": { 00:18:30.674 "read": true, 00:18:30.674 "write": true, 00:18:30.674 "unmap": true, 00:18:30.674 "flush": true, 00:18:30.674 "reset": true, 00:18:30.674 "nvme_admin": false, 00:18:30.674 "nvme_io": false, 00:18:30.674 "nvme_io_md": false, 00:18:30.674 "write_zeroes": true, 00:18:30.674 "zcopy": true, 00:18:30.674 "get_zone_info": false, 00:18:30.674 
"zone_management": false, 00:18:30.674 "zone_append": false, 00:18:30.674 "compare": false, 00:18:30.674 "compare_and_write": false, 00:18:30.674 "abort": true, 00:18:30.674 "seek_hole": false, 00:18:30.674 "seek_data": false, 00:18:30.674 "copy": true, 00:18:30.674 "nvme_iov_md": false 00:18:30.674 }, 00:18:30.674 "memory_domains": [ 00:18:30.674 { 00:18:30.674 "dma_device_id": "system", 00:18:30.674 "dma_device_type": 1 00:18:30.674 }, 00:18:30.674 { 00:18:30.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.674 "dma_device_type": 2 00:18:30.674 } 00:18:30.674 ], 00:18:30.674 "driver_specific": {} 00:18:30.674 } 00:18:30.674 ] 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.674 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.933 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.933 "name": "Existed_Raid", 00:18:30.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.933 "strip_size_kb": 64, 00:18:30.933 "state": "configuring", 00:18:30.933 "raid_level": "raid0", 00:18:30.933 "superblock": false, 00:18:30.933 "num_base_bdevs": 4, 00:18:30.933 "num_base_bdevs_discovered": 3, 00:18:30.933 "num_base_bdevs_operational": 4, 00:18:30.933 "base_bdevs_list": [ 00:18:30.933 { 00:18:30.933 "name": "BaseBdev1", 00:18:30.933 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515", 00:18:30.933 "is_configured": true, 00:18:30.933 "data_offset": 0, 00:18:30.933 "data_size": 65536 00:18:30.933 }, 00:18:30.933 { 00:18:30.933 "name": "BaseBdev2", 00:18:30.933 "uuid": "3aa57b10-8a19-42fc-8fc2-aadf3f986659", 00:18:30.933 "is_configured": true, 00:18:30.933 "data_offset": 0, 00:18:30.933 "data_size": 65536 00:18:30.933 }, 00:18:30.933 { 00:18:30.933 "name": "BaseBdev3", 00:18:30.933 "uuid": "a06420ff-3012-4cea-8972-04214f8d930f", 00:18:30.933 "is_configured": true, 00:18:30.933 "data_offset": 0, 00:18:30.933 "data_size": 65536 00:18:30.933 }, 00:18:30.933 { 00:18:30.933 "name": "BaseBdev4", 00:18:30.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.933 "is_configured": false, 00:18:30.933 "data_offset": 0, 00:18:30.933 "data_size": 0 00:18:30.933 } 00:18:30.933 ] 00:18:30.933 }' 00:18:30.933 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:18:30.933 19:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.499 19:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:31.757 [2024-07-24 19:54:23.178226] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:31.757 [2024-07-24 19:54:23.178264] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b7d300 00:18:31.757 [2024-07-24 19:54:23.178273] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:31.757 [2024-07-24 19:54:23.178500] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b7e280 00:18:31.757 [2024-07-24 19:54:23.178634] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b7d300 00:18:31.757 [2024-07-24 19:54:23.178644] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b7d300 00:18:31.757 [2024-07-24 19:54:23.178809] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:31.757 BaseBdev4 00:18:31.757 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:31.757 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:18:31.758 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:31.758 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:31.758 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:31.758 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:31.758 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:32.015 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:32.273 [ 00:18:32.273 { 00:18:32.273 "name": "BaseBdev4", 00:18:32.273 "aliases": [ 00:18:32.273 "43c683a4-5b3c-4abc-a60d-8452fca6b317" 00:18:32.273 ], 00:18:32.273 "product_name": "Malloc disk", 00:18:32.273 "block_size": 512, 00:18:32.273 "num_blocks": 65536, 00:18:32.273 "uuid": "43c683a4-5b3c-4abc-a60d-8452fca6b317", 00:18:32.273 "assigned_rate_limits": { 00:18:32.273 "rw_ios_per_sec": 0, 00:18:32.273 "rw_mbytes_per_sec": 0, 00:18:32.273 "r_mbytes_per_sec": 0, 00:18:32.273 "w_mbytes_per_sec": 0 00:18:32.273 }, 00:18:32.273 "claimed": true, 00:18:32.273 "claim_type": "exclusive_write", 00:18:32.273 "zoned": false, 00:18:32.273 "supported_io_types": { 00:18:32.273 "read": true, 00:18:32.273 "write": true, 00:18:32.273 "unmap": true, 00:18:32.273 "flush": true, 00:18:32.273 "reset": true, 00:18:32.273 "nvme_admin": false, 00:18:32.273 "nvme_io": false, 00:18:32.273 "nvme_io_md": false, 00:18:32.273 "write_zeroes": true, 00:18:32.273 "zcopy": true, 00:18:32.273 "get_zone_info": false, 00:18:32.273 "zone_management": false, 00:18:32.273 "zone_append": false, 00:18:32.273 "compare": false, 00:18:32.273 "compare_and_write": false, 00:18:32.273 "abort": true, 00:18:32.273 "seek_hole": false, 00:18:32.273 "seek_data": false, 00:18:32.273 "copy": true, 00:18:32.273 "nvme_iov_md": false 00:18:32.273 }, 00:18:32.273 "memory_domains": [ 00:18:32.273 { 00:18:32.273 "dma_device_id": "system", 00:18:32.273 "dma_device_type": 1 00:18:32.273 }, 00:18:32.273 { 00:18:32.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.273 "dma_device_type": 2 00:18:32.273 } 00:18:32.273 ], 00:18:32.273 "driver_specific": {} 00:18:32.273 } 00:18:32.273 ] 
00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.273 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:32.532 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.532 "name": "Existed_Raid", 00:18:32.532 "uuid": "c3e9e9a2-724a-42da-b25c-0f9c4b8466d1", 
00:18:32.532 "strip_size_kb": 64, 00:18:32.532 "state": "online", 00:18:32.532 "raid_level": "raid0", 00:18:32.532 "superblock": false, 00:18:32.532 "num_base_bdevs": 4, 00:18:32.532 "num_base_bdevs_discovered": 4, 00:18:32.532 "num_base_bdevs_operational": 4, 00:18:32.532 "base_bdevs_list": [ 00:18:32.532 { 00:18:32.532 "name": "BaseBdev1", 00:18:32.532 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515", 00:18:32.532 "is_configured": true, 00:18:32.532 "data_offset": 0, 00:18:32.532 "data_size": 65536 00:18:32.532 }, 00:18:32.532 { 00:18:32.532 "name": "BaseBdev2", 00:18:32.532 "uuid": "3aa57b10-8a19-42fc-8fc2-aadf3f986659", 00:18:32.532 "is_configured": true, 00:18:32.532 "data_offset": 0, 00:18:32.532 "data_size": 65536 00:18:32.532 }, 00:18:32.532 { 00:18:32.532 "name": "BaseBdev3", 00:18:32.532 "uuid": "a06420ff-3012-4cea-8972-04214f8d930f", 00:18:32.532 "is_configured": true, 00:18:32.532 "data_offset": 0, 00:18:32.532 "data_size": 65536 00:18:32.532 }, 00:18:32.532 { 00:18:32.532 "name": "BaseBdev4", 00:18:32.532 "uuid": "43c683a4-5b3c-4abc-a60d-8452fca6b317", 00:18:32.532 "is_configured": true, 00:18:32.532 "data_offset": 0, 00:18:32.532 "data_size": 65536 00:18:32.532 } 00:18:32.532 ] 00:18:32.532 }' 00:18:32.532 19:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.532 19:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:33.099 [2024-07-24 19:54:24.618412] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:33.099 "name": "Existed_Raid", 00:18:33.099 "aliases": [ 00:18:33.099 "c3e9e9a2-724a-42da-b25c-0f9c4b8466d1" 00:18:33.099 ], 00:18:33.099 "product_name": "Raid Volume", 00:18:33.099 "block_size": 512, 00:18:33.099 "num_blocks": 262144, 00:18:33.099 "uuid": "c3e9e9a2-724a-42da-b25c-0f9c4b8466d1", 00:18:33.099 "assigned_rate_limits": { 00:18:33.099 "rw_ios_per_sec": 0, 00:18:33.099 "rw_mbytes_per_sec": 0, 00:18:33.099 "r_mbytes_per_sec": 0, 00:18:33.099 "w_mbytes_per_sec": 0 00:18:33.099 }, 00:18:33.099 "claimed": false, 00:18:33.099 "zoned": false, 00:18:33.099 "supported_io_types": { 00:18:33.099 "read": true, 00:18:33.099 "write": true, 00:18:33.099 "unmap": true, 00:18:33.099 "flush": true, 00:18:33.099 "reset": true, 00:18:33.099 "nvme_admin": false, 00:18:33.099 "nvme_io": false, 00:18:33.099 "nvme_io_md": false, 00:18:33.099 "write_zeroes": true, 00:18:33.099 "zcopy": false, 00:18:33.099 "get_zone_info": false, 00:18:33.099 "zone_management": false, 00:18:33.099 "zone_append": false, 00:18:33.099 "compare": false, 00:18:33.099 "compare_and_write": false, 00:18:33.099 "abort": false, 00:18:33.099 "seek_hole": false, 00:18:33.099 "seek_data": false, 00:18:33.099 "copy": false, 00:18:33.099 "nvme_iov_md": false 00:18:33.099 }, 00:18:33.099 "memory_domains": [ 00:18:33.099 { 00:18:33.099 "dma_device_id": "system", 00:18:33.099 
"dma_device_type": 1 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.099 "dma_device_type": 2 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "dma_device_id": "system", 00:18:33.099 "dma_device_type": 1 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.099 "dma_device_type": 2 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "dma_device_id": "system", 00:18:33.099 "dma_device_type": 1 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.099 "dma_device_type": 2 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "dma_device_id": "system", 00:18:33.099 "dma_device_type": 1 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.099 "dma_device_type": 2 00:18:33.099 } 00:18:33.099 ], 00:18:33.099 "driver_specific": { 00:18:33.099 "raid": { 00:18:33.099 "uuid": "c3e9e9a2-724a-42da-b25c-0f9c4b8466d1", 00:18:33.099 "strip_size_kb": 64, 00:18:33.099 "state": "online", 00:18:33.099 "raid_level": "raid0", 00:18:33.099 "superblock": false, 00:18:33.099 "num_base_bdevs": 4, 00:18:33.099 "num_base_bdevs_discovered": 4, 00:18:33.099 "num_base_bdevs_operational": 4, 00:18:33.099 "base_bdevs_list": [ 00:18:33.099 { 00:18:33.099 "name": "BaseBdev1", 00:18:33.099 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515", 00:18:33.099 "is_configured": true, 00:18:33.099 "data_offset": 0, 00:18:33.099 "data_size": 65536 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "name": "BaseBdev2", 00:18:33.099 "uuid": "3aa57b10-8a19-42fc-8fc2-aadf3f986659", 00:18:33.099 "is_configured": true, 00:18:33.099 "data_offset": 0, 00:18:33.099 "data_size": 65536 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "name": "BaseBdev3", 00:18:33.099 "uuid": "a06420ff-3012-4cea-8972-04214f8d930f", 00:18:33.099 "is_configured": true, 00:18:33.099 "data_offset": 0, 00:18:33.099 "data_size": 65536 00:18:33.099 }, 00:18:33.099 { 00:18:33.099 "name": "BaseBdev4", 00:18:33.099 
"uuid": "43c683a4-5b3c-4abc-a60d-8452fca6b317", 00:18:33.099 "is_configured": true, 00:18:33.099 "data_offset": 0, 00:18:33.099 "data_size": 65536 00:18:33.099 } 00:18:33.099 ] 00:18:33.099 } 00:18:33.099 } 00:18:33.099 }' 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:33.099 BaseBdev2 00:18:33.099 BaseBdev3 00:18:33.099 BaseBdev4' 00:18:33.099 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:33.358 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:33.358 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:33.358 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:33.358 "name": "BaseBdev1", 00:18:33.358 "aliases": [ 00:18:33.358 "381997e7-acdc-48da-baac-1c9a8c411515" 00:18:33.358 ], 00:18:33.358 "product_name": "Malloc disk", 00:18:33.358 "block_size": 512, 00:18:33.358 "num_blocks": 65536, 00:18:33.358 "uuid": "381997e7-acdc-48da-baac-1c9a8c411515", 00:18:33.358 "assigned_rate_limits": { 00:18:33.358 "rw_ios_per_sec": 0, 00:18:33.358 "rw_mbytes_per_sec": 0, 00:18:33.358 "r_mbytes_per_sec": 0, 00:18:33.358 "w_mbytes_per_sec": 0 00:18:33.358 }, 00:18:33.358 "claimed": true, 00:18:33.358 "claim_type": "exclusive_write", 00:18:33.358 "zoned": false, 00:18:33.358 "supported_io_types": { 00:18:33.358 "read": true, 00:18:33.358 "write": true, 00:18:33.358 "unmap": true, 00:18:33.358 "flush": true, 00:18:33.358 "reset": true, 00:18:33.358 "nvme_admin": false, 00:18:33.358 "nvme_io": false, 00:18:33.358 "nvme_io_md": false, 00:18:33.358 
"write_zeroes": true, 00:18:33.358 "zcopy": true, 00:18:33.358 "get_zone_info": false, 00:18:33.358 "zone_management": false, 00:18:33.358 "zone_append": false, 00:18:33.358 "compare": false, 00:18:33.358 "compare_and_write": false, 00:18:33.358 "abort": true, 00:18:33.358 "seek_hole": false, 00:18:33.358 "seek_data": false, 00:18:33.358 "copy": true, 00:18:33.358 "nvme_iov_md": false 00:18:33.358 }, 00:18:33.358 "memory_domains": [ 00:18:33.358 { 00:18:33.358 "dma_device_id": "system", 00:18:33.358 "dma_device_type": 1 00:18:33.358 }, 00:18:33.358 { 00:18:33.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.358 "dma_device_type": 2 00:18:33.358 } 00:18:33.358 ], 00:18:33.358 "driver_specific": {} 00:18:33.358 }' 00:18:33.358 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.616 19:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.616 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:33.616 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.617 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.617 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:33.617 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.875 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.875 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.875 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.875 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.875 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.875 19:54:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:33.875 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:33.875 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:34.441 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:34.441 "name": "BaseBdev2", 00:18:34.441 "aliases": [ 00:18:34.441 "3aa57b10-8a19-42fc-8fc2-aadf3f986659" 00:18:34.441 ], 00:18:34.441 "product_name": "Malloc disk", 00:18:34.441 "block_size": 512, 00:18:34.441 "num_blocks": 65536, 00:18:34.441 "uuid": "3aa57b10-8a19-42fc-8fc2-aadf3f986659", 00:18:34.441 "assigned_rate_limits": { 00:18:34.441 "rw_ios_per_sec": 0, 00:18:34.441 "rw_mbytes_per_sec": 0, 00:18:34.441 "r_mbytes_per_sec": 0, 00:18:34.441 "w_mbytes_per_sec": 0 00:18:34.441 }, 00:18:34.441 "claimed": true, 00:18:34.441 "claim_type": "exclusive_write", 00:18:34.441 "zoned": false, 00:18:34.441 "supported_io_types": { 00:18:34.441 "read": true, 00:18:34.441 "write": true, 00:18:34.441 "unmap": true, 00:18:34.441 "flush": true, 00:18:34.441 "reset": true, 00:18:34.441 "nvme_admin": false, 00:18:34.441 "nvme_io": false, 00:18:34.441 "nvme_io_md": false, 00:18:34.441 "write_zeroes": true, 00:18:34.441 "zcopy": true, 00:18:34.441 "get_zone_info": false, 00:18:34.441 "zone_management": false, 00:18:34.441 "zone_append": false, 00:18:34.441 "compare": false, 00:18:34.441 "compare_and_write": false, 00:18:34.441 "abort": true, 00:18:34.441 "seek_hole": false, 00:18:34.441 "seek_data": false, 00:18:34.441 "copy": true, 00:18:34.441 "nvme_iov_md": false 00:18:34.441 }, 00:18:34.441 "memory_domains": [ 00:18:34.441 { 00:18:34.441 "dma_device_id": "system", 00:18:34.441 "dma_device_type": 1 00:18:34.441 }, 00:18:34.441 { 00:18:34.441 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:34.441 "dma_device_type": 2 00:18:34.441 } 00:18:34.441 ], 00:18:34.441 "driver_specific": {} 00:18:34.441 }' 00:18:34.441 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.441 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.441 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:34.441 19:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.699 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.699 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:34.699 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.699 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.699 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:34.699 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.699 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.957 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:34.957 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:34.957 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:34.957 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:35.523 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:35.523 "name": "BaseBdev3", 00:18:35.523 "aliases": [ 00:18:35.523 
"a06420ff-3012-4cea-8972-04214f8d930f" 00:18:35.523 ], 00:18:35.523 "product_name": "Malloc disk", 00:18:35.523 "block_size": 512, 00:18:35.523 "num_blocks": 65536, 00:18:35.523 "uuid": "a06420ff-3012-4cea-8972-04214f8d930f", 00:18:35.523 "assigned_rate_limits": { 00:18:35.523 "rw_ios_per_sec": 0, 00:18:35.523 "rw_mbytes_per_sec": 0, 00:18:35.523 "r_mbytes_per_sec": 0, 00:18:35.523 "w_mbytes_per_sec": 0 00:18:35.523 }, 00:18:35.523 "claimed": true, 00:18:35.523 "claim_type": "exclusive_write", 00:18:35.523 "zoned": false, 00:18:35.523 "supported_io_types": { 00:18:35.523 "read": true, 00:18:35.523 "write": true, 00:18:35.523 "unmap": true, 00:18:35.523 "flush": true, 00:18:35.523 "reset": true, 00:18:35.523 "nvme_admin": false, 00:18:35.523 "nvme_io": false, 00:18:35.523 "nvme_io_md": false, 00:18:35.523 "write_zeroes": true, 00:18:35.523 "zcopy": true, 00:18:35.523 "get_zone_info": false, 00:18:35.523 "zone_management": false, 00:18:35.523 "zone_append": false, 00:18:35.523 "compare": false, 00:18:35.523 "compare_and_write": false, 00:18:35.523 "abort": true, 00:18:35.523 "seek_hole": false, 00:18:35.523 "seek_data": false, 00:18:35.523 "copy": true, 00:18:35.523 "nvme_iov_md": false 00:18:35.523 }, 00:18:35.523 "memory_domains": [ 00:18:35.523 { 00:18:35.523 "dma_device_id": "system", 00:18:35.523 "dma_device_type": 1 00:18:35.523 }, 00:18:35.523 { 00:18:35.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.523 "dma_device_type": 2 00:18:35.523 } 00:18:35.523 ], 00:18:35.523 "driver_specific": {} 00:18:35.523 }' 00:18:35.523 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.523 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.523 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:35.523 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.523 19:54:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.523 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:35.523 19:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.523 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.523 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:35.523 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.781 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.781 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:35.781 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:35.782 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:35.782 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:36.039 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:36.039 "name": "BaseBdev4", 00:18:36.039 "aliases": [ 00:18:36.039 "43c683a4-5b3c-4abc-a60d-8452fca6b317" 00:18:36.039 ], 00:18:36.039 "product_name": "Malloc disk", 00:18:36.039 "block_size": 512, 00:18:36.039 "num_blocks": 65536, 00:18:36.039 "uuid": "43c683a4-5b3c-4abc-a60d-8452fca6b317", 00:18:36.039 "assigned_rate_limits": { 00:18:36.039 "rw_ios_per_sec": 0, 00:18:36.039 "rw_mbytes_per_sec": 0, 00:18:36.039 "r_mbytes_per_sec": 0, 00:18:36.039 "w_mbytes_per_sec": 0 00:18:36.039 }, 00:18:36.039 "claimed": true, 00:18:36.039 "claim_type": "exclusive_write", 00:18:36.039 "zoned": false, 00:18:36.039 "supported_io_types": { 00:18:36.039 "read": true, 
00:18:36.039 "write": true, 00:18:36.039 "unmap": true, 00:18:36.039 "flush": true, 00:18:36.039 "reset": true, 00:18:36.039 "nvme_admin": false, 00:18:36.039 "nvme_io": false, 00:18:36.039 "nvme_io_md": false, 00:18:36.039 "write_zeroes": true, 00:18:36.039 "zcopy": true, 00:18:36.039 "get_zone_info": false, 00:18:36.039 "zone_management": false, 00:18:36.039 "zone_append": false, 00:18:36.039 "compare": false, 00:18:36.039 "compare_and_write": false, 00:18:36.039 "abort": true, 00:18:36.039 "seek_hole": false, 00:18:36.039 "seek_data": false, 00:18:36.039 "copy": true, 00:18:36.039 "nvme_iov_md": false 00:18:36.039 }, 00:18:36.039 "memory_domains": [ 00:18:36.039 { 00:18:36.039 "dma_device_id": "system", 00:18:36.039 "dma_device_type": 1 00:18:36.039 }, 00:18:36.039 { 00:18:36.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.039 "dma_device_type": 2 00:18:36.039 } 00:18:36.039 ], 00:18:36.039 "driver_specific": {} 00:18:36.039 }' 00:18:36.039 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.039 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.039 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:36.040 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.040 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.040 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:36.040 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.297 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.297 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:36.297 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.297 
19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.297 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:36.297 19:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:36.555 [2024-07-24 19:54:28.007108] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:36.555 [2024-07-24 19:54:28.007135] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:36.555 [2024-07-24 19:54:28.007183] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.555 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:36.814 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.814 "name": "Existed_Raid", 00:18:36.814 "uuid": "c3e9e9a2-724a-42da-b25c-0f9c4b8466d1", 00:18:36.814 "strip_size_kb": 64, 00:18:36.814 "state": "offline", 00:18:36.814 "raid_level": "raid0", 00:18:36.814 "superblock": false, 00:18:36.814 "num_base_bdevs": 4, 00:18:36.814 "num_base_bdevs_discovered": 3, 00:18:36.814 "num_base_bdevs_operational": 3, 00:18:36.814 "base_bdevs_list": [ 00:18:36.814 { 00:18:36.814 "name": null, 00:18:36.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:36.814 "is_configured": false, 00:18:36.814 "data_offset": 0, 00:18:36.814 "data_size": 65536 00:18:36.814 }, 00:18:36.814 { 00:18:36.814 "name": "BaseBdev2", 00:18:36.814 "uuid": "3aa57b10-8a19-42fc-8fc2-aadf3f986659", 00:18:36.814 "is_configured": true, 00:18:36.814 "data_offset": 0, 00:18:36.814 "data_size": 65536 00:18:36.814 }, 00:18:36.814 { 00:18:36.814 "name": "BaseBdev3", 00:18:36.814 "uuid": "a06420ff-3012-4cea-8972-04214f8d930f", 00:18:36.814 "is_configured": true, 00:18:36.814 "data_offset": 0, 00:18:36.814 "data_size": 65536 00:18:36.814 }, 00:18:36.814 { 00:18:36.814 "name": "BaseBdev4", 00:18:36.814 
"uuid": "43c683a4-5b3c-4abc-a60d-8452fca6b317", 00:18:36.814 "is_configured": true, 00:18:36.814 "data_offset": 0, 00:18:36.814 "data_size": 65536 00:18:36.814 } 00:18:36.814 ] 00:18:36.814 }' 00:18:36.814 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.814 19:54:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.748 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:37.748 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:37.748 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.748 19:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:37.748 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:37.748 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:37.748 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:38.007 [2024-07-24 19:54:29.452026] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:38.007 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:38.007 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:38.007 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.007 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:38.265 
19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:38.265 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:38.265 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:38.522 [2024-07-24 19:54:29.957753] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:38.523 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:38.523 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:38.523 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.523 19:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:39.089 19:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:39.089 19:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:39.089 19:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:39.348 [2024-07-24 19:54:30.740076] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:39.348 [2024-07-24 19:54:30.740127] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b7d300 name Existed_Raid, state offline 00:18:39.348 19:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:39.348 19:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:39.348 19:54:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.348 19:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:39.606 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:39.606 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:39.606 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:39.606 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:39.606 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:39.606 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:39.871 BaseBdev2 00:18:39.871 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:39.871 19:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:18:39.871 19:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:39.871 19:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:39.871 19:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:39.871 19:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:39.871 19:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:40.189 19:54:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:40.189 [ 00:18:40.189 { 00:18:40.189 "name": "BaseBdev2", 00:18:40.189 "aliases": [ 00:18:40.189 "4231a66f-667d-4500-8202-9ef037709552" 00:18:40.189 ], 00:18:40.189 "product_name": "Malloc disk", 00:18:40.189 "block_size": 512, 00:18:40.189 "num_blocks": 65536, 00:18:40.189 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:40.189 "assigned_rate_limits": { 00:18:40.189 "rw_ios_per_sec": 0, 00:18:40.189 "rw_mbytes_per_sec": 0, 00:18:40.189 "r_mbytes_per_sec": 0, 00:18:40.189 "w_mbytes_per_sec": 0 00:18:40.189 }, 00:18:40.189 "claimed": false, 00:18:40.189 "zoned": false, 00:18:40.189 "supported_io_types": { 00:18:40.189 "read": true, 00:18:40.189 "write": true, 00:18:40.189 "unmap": true, 00:18:40.189 "flush": true, 00:18:40.189 "reset": true, 00:18:40.189 "nvme_admin": false, 00:18:40.189 "nvme_io": false, 00:18:40.189 "nvme_io_md": false, 00:18:40.189 "write_zeroes": true, 00:18:40.189 "zcopy": true, 00:18:40.189 "get_zone_info": false, 00:18:40.189 "zone_management": false, 00:18:40.189 "zone_append": false, 00:18:40.189 "compare": false, 00:18:40.189 "compare_and_write": false, 00:18:40.189 "abort": true, 00:18:40.189 "seek_hole": false, 00:18:40.189 "seek_data": false, 00:18:40.189 "copy": true, 00:18:40.189 "nvme_iov_md": false 00:18:40.189 }, 00:18:40.189 "memory_domains": [ 00:18:40.189 { 00:18:40.189 "dma_device_id": "system", 00:18:40.189 "dma_device_type": 1 00:18:40.189 }, 00:18:40.189 { 00:18:40.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.189 "dma_device_type": 2 00:18:40.189 } 00:18:40.189 ], 00:18:40.189 "driver_specific": {} 00:18:40.189 } 00:18:40.189 ] 00:18:40.189 19:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:40.189 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:40.189 19:54:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:40.189 19:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:40.447 BaseBdev3 00:18:40.447 19:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:40.447 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:18:40.447 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:40.447 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:40.447 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:40.447 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:40.447 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:40.704 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:40.961 [ 00:18:40.961 { 00:18:40.961 "name": "BaseBdev3", 00:18:40.961 "aliases": [ 00:18:40.961 "45de6d3e-2252-41c8-9163-9baf379a5039" 00:18:40.961 ], 00:18:40.961 "product_name": "Malloc disk", 00:18:40.961 "block_size": 512, 00:18:40.961 "num_blocks": 65536, 00:18:40.961 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:40.961 "assigned_rate_limits": { 00:18:40.961 "rw_ios_per_sec": 0, 00:18:40.961 "rw_mbytes_per_sec": 0, 00:18:40.961 "r_mbytes_per_sec": 0, 00:18:40.961 "w_mbytes_per_sec": 0 00:18:40.961 }, 00:18:40.961 "claimed": false, 00:18:40.961 
"zoned": false, 00:18:40.961 "supported_io_types": { 00:18:40.961 "read": true, 00:18:40.961 "write": true, 00:18:40.961 "unmap": true, 00:18:40.961 "flush": true, 00:18:40.961 "reset": true, 00:18:40.961 "nvme_admin": false, 00:18:40.961 "nvme_io": false, 00:18:40.961 "nvme_io_md": false, 00:18:40.961 "write_zeroes": true, 00:18:40.961 "zcopy": true, 00:18:40.961 "get_zone_info": false, 00:18:40.961 "zone_management": false, 00:18:40.961 "zone_append": false, 00:18:40.961 "compare": false, 00:18:40.961 "compare_and_write": false, 00:18:40.961 "abort": true, 00:18:40.961 "seek_hole": false, 00:18:40.961 "seek_data": false, 00:18:40.961 "copy": true, 00:18:40.961 "nvme_iov_md": false 00:18:40.961 }, 00:18:40.961 "memory_domains": [ 00:18:40.961 { 00:18:40.961 "dma_device_id": "system", 00:18:40.961 "dma_device_type": 1 00:18:40.961 }, 00:18:40.961 { 00:18:40.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.962 "dma_device_type": 2 00:18:40.962 } 00:18:40.962 ], 00:18:40.962 "driver_specific": {} 00:18:40.962 } 00:18:40.962 ] 00:18:40.962 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:40.962 19:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:40.962 19:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:40.962 19:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:41.219 BaseBdev4 00:18:41.219 19:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:41.219 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:18:41.219 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:41.219 19:54:32 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@901 -- # local i 00:18:41.219 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:41.219 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:41.219 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:41.476 19:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:41.734 [ 00:18:41.734 { 00:18:41.734 "name": "BaseBdev4", 00:18:41.734 "aliases": [ 00:18:41.734 "8b2a6983-2045-4ee1-82a7-96d07bdb6bac" 00:18:41.734 ], 00:18:41.734 "product_name": "Malloc disk", 00:18:41.734 "block_size": 512, 00:18:41.734 "num_blocks": 65536, 00:18:41.734 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:41.734 "assigned_rate_limits": { 00:18:41.734 "rw_ios_per_sec": 0, 00:18:41.734 "rw_mbytes_per_sec": 0, 00:18:41.734 "r_mbytes_per_sec": 0, 00:18:41.734 "w_mbytes_per_sec": 0 00:18:41.734 }, 00:18:41.734 "claimed": false, 00:18:41.734 "zoned": false, 00:18:41.734 "supported_io_types": { 00:18:41.734 "read": true, 00:18:41.734 "write": true, 00:18:41.734 "unmap": true, 00:18:41.734 "flush": true, 00:18:41.734 "reset": true, 00:18:41.734 "nvme_admin": false, 00:18:41.734 "nvme_io": false, 00:18:41.734 "nvme_io_md": false, 00:18:41.734 "write_zeroes": true, 00:18:41.734 "zcopy": true, 00:18:41.734 "get_zone_info": false, 00:18:41.734 "zone_management": false, 00:18:41.734 "zone_append": false, 00:18:41.734 "compare": false, 00:18:41.734 "compare_and_write": false, 00:18:41.734 "abort": true, 00:18:41.734 "seek_hole": false, 00:18:41.734 "seek_data": false, 00:18:41.734 "copy": true, 00:18:41.734 "nvme_iov_md": false 00:18:41.734 }, 00:18:41.734 
"memory_domains": [ 00:18:41.734 { 00:18:41.734 "dma_device_id": "system", 00:18:41.734 "dma_device_type": 1 00:18:41.734 }, 00:18:41.734 { 00:18:41.734 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.734 "dma_device_type": 2 00:18:41.734 } 00:18:41.734 ], 00:18:41.734 "driver_specific": {} 00:18:41.734 } 00:18:41.734 ] 00:18:41.734 19:54:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:41.734 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:41.734 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:41.734 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:41.992 [2024-07-24 19:54:33.464122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:41.992 [2024-07-24 19:54:33.464165] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:41.992 [2024-07-24 19:54:33.464185] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:41.992 [2024-07-24 19:54:33.465728] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:41.992 [2024-07-24 19:54:33.465772] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.992 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.250 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.250 "name": "Existed_Raid", 00:18:42.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.250 "strip_size_kb": 64, 00:18:42.250 "state": "configuring", 00:18:42.250 "raid_level": "raid0", 00:18:42.250 "superblock": false, 00:18:42.250 "num_base_bdevs": 4, 00:18:42.250 "num_base_bdevs_discovered": 3, 00:18:42.250 "num_base_bdevs_operational": 4, 00:18:42.250 "base_bdevs_list": [ 00:18:42.250 { 00:18:42.250 "name": "BaseBdev1", 00:18:42.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.250 "is_configured": false, 00:18:42.250 "data_offset": 0, 00:18:42.250 "data_size": 0 00:18:42.250 }, 00:18:42.250 { 00:18:42.250 "name": "BaseBdev2", 00:18:42.250 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:42.250 "is_configured": true, 00:18:42.250 "data_offset": 0, 00:18:42.250 "data_size": 65536 00:18:42.250 }, 
00:18:42.250 { 00:18:42.250 "name": "BaseBdev3", 00:18:42.250 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:42.250 "is_configured": true, 00:18:42.250 "data_offset": 0, 00:18:42.250 "data_size": 65536 00:18:42.250 }, 00:18:42.250 { 00:18:42.250 "name": "BaseBdev4", 00:18:42.250 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:42.250 "is_configured": true, 00:18:42.250 "data_offset": 0, 00:18:42.250 "data_size": 65536 00:18:42.250 } 00:18:42.250 ] 00:18:42.250 }' 00:18:42.250 19:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.250 19:54:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:43.182 [2024-07-24 19:54:34.643295] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.182 19:54:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.182 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.440 19:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.440 "name": "Existed_Raid", 00:18:43.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.440 "strip_size_kb": 64, 00:18:43.440 "state": "configuring", 00:18:43.440 "raid_level": "raid0", 00:18:43.440 "superblock": false, 00:18:43.440 "num_base_bdevs": 4, 00:18:43.440 "num_base_bdevs_discovered": 2, 00:18:43.440 "num_base_bdevs_operational": 4, 00:18:43.440 "base_bdevs_list": [ 00:18:43.440 { 00:18:43.440 "name": "BaseBdev1", 00:18:43.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.440 "is_configured": false, 00:18:43.440 "data_offset": 0, 00:18:43.440 "data_size": 0 00:18:43.440 }, 00:18:43.440 { 00:18:43.440 "name": null, 00:18:43.440 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:43.440 "is_configured": false, 00:18:43.440 "data_offset": 0, 00:18:43.440 "data_size": 65536 00:18:43.440 }, 00:18:43.440 { 00:18:43.440 "name": "BaseBdev3", 00:18:43.440 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:43.440 "is_configured": true, 00:18:43.440 "data_offset": 0, 00:18:43.440 "data_size": 65536 00:18:43.440 }, 00:18:43.440 { 00:18:43.440 "name": "BaseBdev4", 00:18:43.440 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:43.440 "is_configured": true, 00:18:43.441 "data_offset": 0, 00:18:43.441 "data_size": 65536 00:18:43.441 } 00:18:43.441 ] 00:18:43.441 }' 00:18:43.441 19:54:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.441 19:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.005 19:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.005 19:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:44.263 19:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:44.263 19:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:44.521 [2024-07-24 19:54:36.006294] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:44.521 BaseBdev1 00:18:44.521 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:44.521 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:18:44.521 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:44.521 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:44.521 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:44.521 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:44.521 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:44.778 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:45.037 [ 00:18:45.037 { 00:18:45.037 "name": "BaseBdev1", 00:18:45.037 "aliases": [ 00:18:45.037 "a62d575a-616c-401d-b8b6-898f04545854" 00:18:45.037 ], 00:18:45.037 "product_name": "Malloc disk", 00:18:45.037 "block_size": 512, 00:18:45.037 "num_blocks": 65536, 00:18:45.037 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:45.037 "assigned_rate_limits": { 00:18:45.037 "rw_ios_per_sec": 0, 00:18:45.037 "rw_mbytes_per_sec": 0, 00:18:45.037 "r_mbytes_per_sec": 0, 00:18:45.037 "w_mbytes_per_sec": 0 00:18:45.037 }, 00:18:45.037 "claimed": true, 00:18:45.037 "claim_type": "exclusive_write", 00:18:45.037 "zoned": false, 00:18:45.037 "supported_io_types": { 00:18:45.037 "read": true, 00:18:45.037 "write": true, 00:18:45.037 "unmap": true, 00:18:45.037 "flush": true, 00:18:45.037 "reset": true, 00:18:45.037 "nvme_admin": false, 00:18:45.037 "nvme_io": false, 00:18:45.037 "nvme_io_md": false, 00:18:45.037 "write_zeroes": true, 00:18:45.037 "zcopy": true, 00:18:45.037 "get_zone_info": false, 00:18:45.037 "zone_management": false, 00:18:45.037 "zone_append": false, 00:18:45.037 "compare": false, 00:18:45.037 "compare_and_write": false, 00:18:45.037 "abort": true, 00:18:45.037 "seek_hole": false, 00:18:45.037 "seek_data": false, 00:18:45.037 "copy": true, 00:18:45.037 "nvme_iov_md": false 00:18:45.037 }, 00:18:45.037 "memory_domains": [ 00:18:45.037 { 00:18:45.037 "dma_device_id": "system", 00:18:45.037 "dma_device_type": 1 00:18:45.037 }, 00:18:45.037 { 00:18:45.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.037 "dma_device_type": 2 00:18:45.037 } 00:18:45.037 ], 00:18:45.037 "driver_specific": {} 00:18:45.037 } 00:18:45.037 ] 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.037 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.295 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.295 "name": "Existed_Raid", 00:18:45.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.295 "strip_size_kb": 64, 00:18:45.295 "state": "configuring", 00:18:45.295 "raid_level": "raid0", 00:18:45.295 "superblock": false, 00:18:45.295 "num_base_bdevs": 4, 00:18:45.295 "num_base_bdevs_discovered": 3, 00:18:45.295 "num_base_bdevs_operational": 4, 00:18:45.295 "base_bdevs_list": [ 00:18:45.295 { 00:18:45.295 "name": "BaseBdev1", 00:18:45.295 "uuid": 
"a62d575a-616c-401d-b8b6-898f04545854", 00:18:45.295 "is_configured": true, 00:18:45.295 "data_offset": 0, 00:18:45.295 "data_size": 65536 00:18:45.295 }, 00:18:45.295 { 00:18:45.295 "name": null, 00:18:45.295 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:45.295 "is_configured": false, 00:18:45.295 "data_offset": 0, 00:18:45.295 "data_size": 65536 00:18:45.295 }, 00:18:45.295 { 00:18:45.295 "name": "BaseBdev3", 00:18:45.295 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:45.295 "is_configured": true, 00:18:45.295 "data_offset": 0, 00:18:45.295 "data_size": 65536 00:18:45.295 }, 00:18:45.295 { 00:18:45.295 "name": "BaseBdev4", 00:18:45.295 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:45.295 "is_configured": true, 00:18:45.295 "data_offset": 0, 00:18:45.295 "data_size": 65536 00:18:45.295 } 00:18:45.295 ] 00:18:45.295 }' 00:18:45.295 19:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.295 19:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.861 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.861 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:46.119 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:46.119 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:46.377 [2024-07-24 19:54:37.811118] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:46.377 19:54:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.377 19:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.635 19:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.635 "name": "Existed_Raid", 00:18:46.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.635 "strip_size_kb": 64, 00:18:46.635 "state": "configuring", 00:18:46.635 "raid_level": "raid0", 00:18:46.635 "superblock": false, 00:18:46.635 "num_base_bdevs": 4, 00:18:46.635 "num_base_bdevs_discovered": 2, 00:18:46.635 "num_base_bdevs_operational": 4, 00:18:46.635 "base_bdevs_list": [ 00:18:46.635 { 00:18:46.635 "name": "BaseBdev1", 00:18:46.635 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:46.635 "is_configured": true, 00:18:46.635 
"data_offset": 0, 00:18:46.635 "data_size": 65536 00:18:46.635 }, 00:18:46.635 { 00:18:46.635 "name": null, 00:18:46.635 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:46.635 "is_configured": false, 00:18:46.635 "data_offset": 0, 00:18:46.635 "data_size": 65536 00:18:46.635 }, 00:18:46.635 { 00:18:46.635 "name": null, 00:18:46.635 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:46.635 "is_configured": false, 00:18:46.635 "data_offset": 0, 00:18:46.635 "data_size": 65536 00:18:46.635 }, 00:18:46.635 { 00:18:46.635 "name": "BaseBdev4", 00:18:46.635 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:46.635 "is_configured": true, 00:18:46.635 "data_offset": 0, 00:18:46.635 "data_size": 65536 00:18:46.635 } 00:18:46.635 ] 00:18:46.635 }' 00:18:46.635 19:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.635 19:54:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.202 19:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:47.202 19:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.461 19:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:47.461 19:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:48.028 [2024-07-24 19:54:39.439450] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:48.028 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:48.028 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.029 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.596 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.596 "name": "Existed_Raid", 00:18:48.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.596 "strip_size_kb": 64, 00:18:48.596 "state": "configuring", 00:18:48.596 "raid_level": "raid0", 00:18:48.596 "superblock": false, 00:18:48.596 "num_base_bdevs": 4, 00:18:48.596 "num_base_bdevs_discovered": 3, 00:18:48.596 "num_base_bdevs_operational": 4, 00:18:48.596 "base_bdevs_list": [ 00:18:48.596 { 00:18:48.596 "name": "BaseBdev1", 00:18:48.596 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:48.596 "is_configured": true, 00:18:48.596 "data_offset": 0, 00:18:48.596 "data_size": 65536 00:18:48.596 }, 00:18:48.596 { 
00:18:48.596 "name": null, 00:18:48.596 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:48.596 "is_configured": false, 00:18:48.596 "data_offset": 0, 00:18:48.596 "data_size": 65536 00:18:48.596 }, 00:18:48.596 { 00:18:48.596 "name": "BaseBdev3", 00:18:48.596 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:48.596 "is_configured": true, 00:18:48.596 "data_offset": 0, 00:18:48.596 "data_size": 65536 00:18:48.596 }, 00:18:48.596 { 00:18:48.596 "name": "BaseBdev4", 00:18:48.596 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:48.596 "is_configured": true, 00:18:48.596 "data_offset": 0, 00:18:48.596 "data_size": 65536 00:18:48.596 } 00:18:48.596 ] 00:18:48.596 }' 00:18:48.596 19:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.596 19:54:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.162 19:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.162 19:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:49.421 19:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:49.421 19:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:49.679 [2024-07-24 19:54:41.091856] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.679 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.247 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.247 "name": "Existed_Raid", 00:18:50.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.247 "strip_size_kb": 64, 00:18:50.247 "state": "configuring", 00:18:50.247 "raid_level": "raid0", 00:18:50.247 "superblock": false, 00:18:50.247 "num_base_bdevs": 4, 00:18:50.247 "num_base_bdevs_discovered": 2, 00:18:50.247 "num_base_bdevs_operational": 4, 00:18:50.247 "base_bdevs_list": [ 00:18:50.247 { 00:18:50.247 "name": null, 00:18:50.247 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:50.247 "is_configured": false, 00:18:50.247 "data_offset": 0, 00:18:50.247 "data_size": 65536 00:18:50.247 }, 00:18:50.247 { 00:18:50.247 "name": null, 00:18:50.247 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:50.247 "is_configured": false, 
00:18:50.247 "data_offset": 0, 00:18:50.247 "data_size": 65536 00:18:50.247 }, 00:18:50.247 { 00:18:50.247 "name": "BaseBdev3", 00:18:50.247 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:50.247 "is_configured": true, 00:18:50.247 "data_offset": 0, 00:18:50.247 "data_size": 65536 00:18:50.247 }, 00:18:50.247 { 00:18:50.247 "name": "BaseBdev4", 00:18:50.247 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:50.247 "is_configured": true, 00:18:50.247 "data_offset": 0, 00:18:50.247 "data_size": 65536 00:18:50.247 } 00:18:50.247 ] 00:18:50.247 }' 00:18:50.247 19:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.247 19:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.814 19:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.814 19:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:51.073 19:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:51.073 19:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:51.639 [2024-07-24 19:54:42.991608] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.639 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:52.208 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.208 "name": "Existed_Raid", 00:18:52.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.208 "strip_size_kb": 64, 00:18:52.208 "state": "configuring", 00:18:52.208 "raid_level": "raid0", 00:18:52.208 "superblock": false, 00:18:52.208 "num_base_bdevs": 4, 00:18:52.208 "num_base_bdevs_discovered": 3, 00:18:52.208 "num_base_bdevs_operational": 4, 00:18:52.208 "base_bdevs_list": [ 00:18:52.208 { 00:18:52.208 "name": null, 00:18:52.208 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:52.208 "is_configured": false, 00:18:52.208 "data_offset": 0, 00:18:52.208 "data_size": 65536 00:18:52.208 }, 00:18:52.208 { 00:18:52.208 "name": "BaseBdev2", 00:18:52.208 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:52.208 "is_configured": true, 00:18:52.208 "data_offset": 0, 00:18:52.208 "data_size": 65536 00:18:52.208 }, 
00:18:52.208 { 00:18:52.208 "name": "BaseBdev3", 00:18:52.208 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:52.208 "is_configured": true, 00:18:52.208 "data_offset": 0, 00:18:52.208 "data_size": 65536 00:18:52.208 }, 00:18:52.208 { 00:18:52.208 "name": "BaseBdev4", 00:18:52.208 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:52.208 "is_configured": true, 00:18:52.208 "data_offset": 0, 00:18:52.208 "data_size": 65536 00:18:52.208 } 00:18:52.208 ] 00:18:52.208 }' 00:18:52.208 19:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.208 19:54:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.774 19:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.774 19:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:53.033 19:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:53.033 19:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.033 19:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:53.033 19:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a62d575a-616c-401d-b8b6-898f04545854 00:18:53.292 [2024-07-24 19:54:44.813340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:53.292 [2024-07-24 19:54:44.813396] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b804f0 00:18:53.292 [2024-07-24 19:54:44.813405] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:53.292 [2024-07-24 19:54:44.813631] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b81250 00:18:53.292 [2024-07-24 19:54:44.813769] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b804f0 00:18:53.292 [2024-07-24 19:54:44.813779] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b804f0 00:18:53.292 [2024-07-24 19:54:44.813961] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:53.292 NewBaseBdev 00:18:53.292 19:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:53.292 19:54:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:18:53.292 19:54:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:53.292 19:54:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:18:53.292 19:54:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:53.292 19:54:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:53.292 19:54:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:53.551 19:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:53.810 [ 00:18:53.810 { 00:18:53.810 "name": "NewBaseBdev", 00:18:53.810 "aliases": [ 00:18:53.810 "a62d575a-616c-401d-b8b6-898f04545854" 00:18:53.810 ], 00:18:53.810 "product_name": "Malloc disk", 00:18:53.810 "block_size": 512, 00:18:53.810 "num_blocks": 65536, 
00:18:53.810 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:53.810 "assigned_rate_limits": { 00:18:53.810 "rw_ios_per_sec": 0, 00:18:53.810 "rw_mbytes_per_sec": 0, 00:18:53.810 "r_mbytes_per_sec": 0, 00:18:53.810 "w_mbytes_per_sec": 0 00:18:53.810 }, 00:18:53.810 "claimed": true, 00:18:53.810 "claim_type": "exclusive_write", 00:18:53.810 "zoned": false, 00:18:53.810 "supported_io_types": { 00:18:53.810 "read": true, 00:18:53.810 "write": true, 00:18:53.810 "unmap": true, 00:18:53.810 "flush": true, 00:18:53.810 "reset": true, 00:18:53.810 "nvme_admin": false, 00:18:53.810 "nvme_io": false, 00:18:53.810 "nvme_io_md": false, 00:18:53.810 "write_zeroes": true, 00:18:53.810 "zcopy": true, 00:18:53.810 "get_zone_info": false, 00:18:53.810 "zone_management": false, 00:18:53.810 "zone_append": false, 00:18:53.810 "compare": false, 00:18:53.810 "compare_and_write": false, 00:18:53.810 "abort": true, 00:18:53.810 "seek_hole": false, 00:18:53.810 "seek_data": false, 00:18:53.810 "copy": true, 00:18:53.810 "nvme_iov_md": false 00:18:53.810 }, 00:18:53.810 "memory_domains": [ 00:18:53.810 { 00:18:53.810 "dma_device_id": "system", 00:18:53.810 "dma_device_type": 1 00:18:53.810 }, 00:18:53.810 { 00:18:53.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.810 "dma_device_type": 2 00:18:53.810 } 00:18:53.810 ], 00:18:53.810 "driver_specific": {} 00:18:53.810 } 00:18:53.810 ] 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:53.810 
19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.810 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.069 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.069 "name": "Existed_Raid", 00:18:54.069 "uuid": "1e88d951-c28b-4347-8fa1-f3c5d3e02bf4", 00:18:54.069 "strip_size_kb": 64, 00:18:54.069 "state": "online", 00:18:54.069 "raid_level": "raid0", 00:18:54.069 "superblock": false, 00:18:54.069 "num_base_bdevs": 4, 00:18:54.069 "num_base_bdevs_discovered": 4, 00:18:54.069 "num_base_bdevs_operational": 4, 00:18:54.069 "base_bdevs_list": [ 00:18:54.069 { 00:18:54.069 "name": "NewBaseBdev", 00:18:54.070 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:54.070 "is_configured": true, 00:18:54.070 "data_offset": 0, 00:18:54.070 "data_size": 65536 00:18:54.070 }, 00:18:54.070 { 00:18:54.070 "name": "BaseBdev2", 00:18:54.070 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:54.070 "is_configured": true, 00:18:54.070 "data_offset": 0, 00:18:54.070 "data_size": 65536 00:18:54.070 }, 00:18:54.070 { 00:18:54.070 "name": "BaseBdev3", 00:18:54.070 
"uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:54.070 "is_configured": true, 00:18:54.070 "data_offset": 0, 00:18:54.070 "data_size": 65536 00:18:54.070 }, 00:18:54.070 { 00:18:54.070 "name": "BaseBdev4", 00:18:54.070 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:54.070 "is_configured": true, 00:18:54.070 "data_offset": 0, 00:18:54.070 "data_size": 65536 00:18:54.070 } 00:18:54.070 ] 00:18:54.070 }' 00:18:54.070 19:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.070 19:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:54.637 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:54.897 [2024-07-24 19:54:46.381856] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:54.897 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:54.897 "name": "Existed_Raid", 00:18:54.897 "aliases": [ 00:18:54.897 "1e88d951-c28b-4347-8fa1-f3c5d3e02bf4" 00:18:54.897 ], 00:18:54.897 "product_name": "Raid Volume", 
00:18:54.897 "block_size": 512, 00:18:54.897 "num_blocks": 262144, 00:18:54.897 "uuid": "1e88d951-c28b-4347-8fa1-f3c5d3e02bf4", 00:18:54.897 "assigned_rate_limits": { 00:18:54.897 "rw_ios_per_sec": 0, 00:18:54.897 "rw_mbytes_per_sec": 0, 00:18:54.897 "r_mbytes_per_sec": 0, 00:18:54.897 "w_mbytes_per_sec": 0 00:18:54.897 }, 00:18:54.897 "claimed": false, 00:18:54.897 "zoned": false, 00:18:54.897 "supported_io_types": { 00:18:54.897 "read": true, 00:18:54.897 "write": true, 00:18:54.897 "unmap": true, 00:18:54.897 "flush": true, 00:18:54.897 "reset": true, 00:18:54.897 "nvme_admin": false, 00:18:54.897 "nvme_io": false, 00:18:54.897 "nvme_io_md": false, 00:18:54.897 "write_zeroes": true, 00:18:54.897 "zcopy": false, 00:18:54.897 "get_zone_info": false, 00:18:54.897 "zone_management": false, 00:18:54.897 "zone_append": false, 00:18:54.897 "compare": false, 00:18:54.897 "compare_and_write": false, 00:18:54.897 "abort": false, 00:18:54.897 "seek_hole": false, 00:18:54.897 "seek_data": false, 00:18:54.897 "copy": false, 00:18:54.897 "nvme_iov_md": false 00:18:54.897 }, 00:18:54.897 "memory_domains": [ 00:18:54.897 { 00:18:54.897 "dma_device_id": "system", 00:18:54.897 "dma_device_type": 1 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.897 "dma_device_type": 2 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "dma_device_id": "system", 00:18:54.897 "dma_device_type": 1 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.897 "dma_device_type": 2 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "dma_device_id": "system", 00:18:54.897 "dma_device_type": 1 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.897 "dma_device_type": 2 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "dma_device_id": "system", 00:18:54.897 "dma_device_type": 1 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.897 "dma_device_type": 2 
00:18:54.897 } 00:18:54.897 ], 00:18:54.897 "driver_specific": { 00:18:54.897 "raid": { 00:18:54.897 "uuid": "1e88d951-c28b-4347-8fa1-f3c5d3e02bf4", 00:18:54.897 "strip_size_kb": 64, 00:18:54.897 "state": "online", 00:18:54.897 "raid_level": "raid0", 00:18:54.897 "superblock": false, 00:18:54.897 "num_base_bdevs": 4, 00:18:54.897 "num_base_bdevs_discovered": 4, 00:18:54.897 "num_base_bdevs_operational": 4, 00:18:54.897 "base_bdevs_list": [ 00:18:54.897 { 00:18:54.897 "name": "NewBaseBdev", 00:18:54.897 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:54.897 "is_configured": true, 00:18:54.897 "data_offset": 0, 00:18:54.897 "data_size": 65536 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "name": "BaseBdev2", 00:18:54.897 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 00:18:54.897 "is_configured": true, 00:18:54.897 "data_offset": 0, 00:18:54.897 "data_size": 65536 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "name": "BaseBdev3", 00:18:54.897 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:54.897 "is_configured": true, 00:18:54.897 "data_offset": 0, 00:18:54.897 "data_size": 65536 00:18:54.897 }, 00:18:54.897 { 00:18:54.897 "name": "BaseBdev4", 00:18:54.897 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:54.897 "is_configured": true, 00:18:54.897 "data_offset": 0, 00:18:54.897 "data_size": 65536 00:18:54.897 } 00:18:54.897 ] 00:18:54.897 } 00:18:54.897 } 00:18:54.897 }' 00:18:54.897 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:54.897 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:54.897 BaseBdev2 00:18:54.897 BaseBdev3 00:18:54.897 BaseBdev4' 00:18:54.897 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.897 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:54.897 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.156 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.156 "name": "NewBaseBdev", 00:18:55.156 "aliases": [ 00:18:55.156 "a62d575a-616c-401d-b8b6-898f04545854" 00:18:55.156 ], 00:18:55.156 "product_name": "Malloc disk", 00:18:55.156 "block_size": 512, 00:18:55.156 "num_blocks": 65536, 00:18:55.156 "uuid": "a62d575a-616c-401d-b8b6-898f04545854", 00:18:55.156 "assigned_rate_limits": { 00:18:55.156 "rw_ios_per_sec": 0, 00:18:55.156 "rw_mbytes_per_sec": 0, 00:18:55.156 "r_mbytes_per_sec": 0, 00:18:55.156 "w_mbytes_per_sec": 0 00:18:55.156 }, 00:18:55.156 "claimed": true, 00:18:55.156 "claim_type": "exclusive_write", 00:18:55.156 "zoned": false, 00:18:55.156 "supported_io_types": { 00:18:55.156 "read": true, 00:18:55.156 "write": true, 00:18:55.156 "unmap": true, 00:18:55.156 "flush": true, 00:18:55.156 "reset": true, 00:18:55.156 "nvme_admin": false, 00:18:55.156 "nvme_io": false, 00:18:55.156 "nvme_io_md": false, 00:18:55.156 "write_zeroes": true, 00:18:55.156 "zcopy": true, 00:18:55.156 "get_zone_info": false, 00:18:55.156 "zone_management": false, 00:18:55.156 "zone_append": false, 00:18:55.156 "compare": false, 00:18:55.156 "compare_and_write": false, 00:18:55.156 "abort": true, 00:18:55.156 "seek_hole": false, 00:18:55.156 "seek_data": false, 00:18:55.156 "copy": true, 00:18:55.156 "nvme_iov_md": false 00:18:55.156 }, 00:18:55.156 "memory_domains": [ 00:18:55.156 { 00:18:55.156 "dma_device_id": "system", 00:18:55.156 "dma_device_type": 1 00:18:55.156 }, 00:18:55.156 { 00:18:55.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.156 "dma_device_type": 2 00:18:55.156 } 00:18:55.156 ], 00:18:55.156 "driver_specific": {} 00:18:55.156 }' 00:18:55.156 19:54:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.156 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.156 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.156 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.156 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.414 19:54:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:55.673 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.673 "name": "BaseBdev2", 00:18:55.673 "aliases": [ 00:18:55.673 "4231a66f-667d-4500-8202-9ef037709552" 00:18:55.673 ], 00:18:55.673 "product_name": "Malloc disk", 00:18:55.673 "block_size": 512, 00:18:55.673 "num_blocks": 65536, 00:18:55.673 "uuid": "4231a66f-667d-4500-8202-9ef037709552", 
00:18:55.673 "assigned_rate_limits": { 00:18:55.673 "rw_ios_per_sec": 0, 00:18:55.673 "rw_mbytes_per_sec": 0, 00:18:55.673 "r_mbytes_per_sec": 0, 00:18:55.673 "w_mbytes_per_sec": 0 00:18:55.673 }, 00:18:55.673 "claimed": true, 00:18:55.673 "claim_type": "exclusive_write", 00:18:55.673 "zoned": false, 00:18:55.673 "supported_io_types": { 00:18:55.673 "read": true, 00:18:55.673 "write": true, 00:18:55.673 "unmap": true, 00:18:55.673 "flush": true, 00:18:55.673 "reset": true, 00:18:55.673 "nvme_admin": false, 00:18:55.673 "nvme_io": false, 00:18:55.673 "nvme_io_md": false, 00:18:55.673 "write_zeroes": true, 00:18:55.673 "zcopy": true, 00:18:55.673 "get_zone_info": false, 00:18:55.673 "zone_management": false, 00:18:55.673 "zone_append": false, 00:18:55.673 "compare": false, 00:18:55.673 "compare_and_write": false, 00:18:55.673 "abort": true, 00:18:55.673 "seek_hole": false, 00:18:55.673 "seek_data": false, 00:18:55.673 "copy": true, 00:18:55.673 "nvme_iov_md": false 00:18:55.673 }, 00:18:55.673 "memory_domains": [ 00:18:55.673 { 00:18:55.673 "dma_device_id": "system", 00:18:55.673 "dma_device_type": 1 00:18:55.673 }, 00:18:55.673 { 00:18:55.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.673 "dma_device_type": 2 00:18:55.673 } 00:18:55.673 ], 00:18:55.673 "driver_specific": {} 00:18:55.673 }' 00:18:55.673 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.673 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.932 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.191 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.191 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:56.191 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:56.191 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:56.450 "name": "BaseBdev3", 00:18:56.450 "aliases": [ 00:18:56.450 "45de6d3e-2252-41c8-9163-9baf379a5039" 00:18:56.450 ], 00:18:56.450 "product_name": "Malloc disk", 00:18:56.450 "block_size": 512, 00:18:56.450 "num_blocks": 65536, 00:18:56.450 "uuid": "45de6d3e-2252-41c8-9163-9baf379a5039", 00:18:56.450 "assigned_rate_limits": { 00:18:56.450 "rw_ios_per_sec": 0, 00:18:56.450 "rw_mbytes_per_sec": 0, 00:18:56.450 "r_mbytes_per_sec": 0, 00:18:56.450 "w_mbytes_per_sec": 0 00:18:56.450 }, 00:18:56.450 "claimed": true, 00:18:56.450 "claim_type": "exclusive_write", 00:18:56.450 "zoned": false, 00:18:56.450 "supported_io_types": { 00:18:56.450 "read": true, 00:18:56.450 "write": true, 00:18:56.450 "unmap": true, 00:18:56.450 "flush": true, 00:18:56.450 "reset": true, 00:18:56.450 "nvme_admin": false, 00:18:56.450 "nvme_io": false, 00:18:56.450 "nvme_io_md": false, 00:18:56.450 "write_zeroes": true, 
00:18:56.450 "zcopy": true, 00:18:56.450 "get_zone_info": false, 00:18:56.450 "zone_management": false, 00:18:56.450 "zone_append": false, 00:18:56.450 "compare": false, 00:18:56.450 "compare_and_write": false, 00:18:56.450 "abort": true, 00:18:56.450 "seek_hole": false, 00:18:56.450 "seek_data": false, 00:18:56.450 "copy": true, 00:18:56.450 "nvme_iov_md": false 00:18:56.450 }, 00:18:56.450 "memory_domains": [ 00:18:56.450 { 00:18:56.450 "dma_device_id": "system", 00:18:56.450 "dma_device_type": 1 00:18:56.450 }, 00:18:56.450 { 00:18:56.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.450 "dma_device_type": 2 00:18:56.450 } 00:18:56.450 ], 00:18:56.450 "driver_specific": {} 00:18:56.450 }' 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:56.450 19:54:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.450 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:56.450 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:56.450 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.708 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:56.708 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:56.708 19:54:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:56.708 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:56.708 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:57.000 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:57.000 "name": "BaseBdev4", 00:18:57.000 "aliases": [ 00:18:57.000 "8b2a6983-2045-4ee1-82a7-96d07bdb6bac" 00:18:57.000 ], 00:18:57.000 "product_name": "Malloc disk", 00:18:57.000 "block_size": 512, 00:18:57.000 "num_blocks": 65536, 00:18:57.001 "uuid": "8b2a6983-2045-4ee1-82a7-96d07bdb6bac", 00:18:57.001 "assigned_rate_limits": { 00:18:57.001 "rw_ios_per_sec": 0, 00:18:57.001 "rw_mbytes_per_sec": 0, 00:18:57.001 "r_mbytes_per_sec": 0, 00:18:57.001 "w_mbytes_per_sec": 0 00:18:57.001 }, 00:18:57.001 "claimed": true, 00:18:57.001 "claim_type": "exclusive_write", 00:18:57.001 "zoned": false, 00:18:57.001 "supported_io_types": { 00:18:57.001 "read": true, 00:18:57.001 "write": true, 00:18:57.001 "unmap": true, 00:18:57.001 "flush": true, 00:18:57.001 "reset": true, 00:18:57.001 "nvme_admin": false, 00:18:57.001 "nvme_io": false, 00:18:57.001 "nvme_io_md": false, 00:18:57.001 "write_zeroes": true, 00:18:57.001 "zcopy": true, 00:18:57.001 "get_zone_info": false, 00:18:57.001 "zone_management": false, 00:18:57.001 "zone_append": false, 00:18:57.001 "compare": false, 00:18:57.001 "compare_and_write": false, 00:18:57.001 "abort": true, 00:18:57.001 "seek_hole": false, 00:18:57.001 "seek_data": false, 00:18:57.001 "copy": true, 00:18:57.001 "nvme_iov_md": false 00:18:57.001 }, 00:18:57.001 "memory_domains": [ 00:18:57.001 { 00:18:57.001 "dma_device_id": "system", 00:18:57.001 "dma_device_type": 1 00:18:57.001 }, 00:18:57.001 { 00:18:57.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.001 "dma_device_type": 2 
00:18:57.001 } 00:18:57.001 ], 00:18:57.001 "driver_specific": {} 00:18:57.001 }' 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.001 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.259 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.259 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.259 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.259 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:57.259 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:57.517 [2024-07-24 19:54:48.952358] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:57.517 [2024-07-24 19:54:48.952385] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:57.517 [2024-07-24 19:54:48.952448] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:57.517 [2024-07-24 19:54:48.952515] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:18:57.517 [2024-07-24 19:54:48.952527] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b804f0 name Existed_Raid, state offline 00:18:57.517 19:54:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1436476 00:18:57.517 19:54:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1436476 ']' 00:18:57.517 19:54:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1436476 00:18:57.517 19:54:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:18:57.518 19:54:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:57.518 19:54:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1436476 00:18:57.518 19:54:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:57.518 19:54:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:57.518 19:54:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1436476' 00:18:57.518 killing process with pid 1436476 00:18:57.518 19:54:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1436476 00:18:57.518 [2024-07-24 19:54:49.032519] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:57.518 19:54:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1436476 00:18:57.518 [2024-07-24 19:54:49.109178] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:58.085 00:18:58.085 real 0m35.276s 00:18:58.085 user 1m4.695s 00:18:58.085 sys 0m6.094s 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 
-- # xtrace_disable 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.085 ************************************ 00:18:58.085 END TEST raid_state_function_test 00:18:58.085 ************************************ 00:18:58.085 19:54:49 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:58.085 19:54:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:58.085 19:54:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:58.085 19:54:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:58.085 ************************************ 00:18:58.085 START TEST raid_state_function_test_sb 00:18:58.085 ************************************ 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 true 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:58.085 19:54:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1441707 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1441707' 00:18:58.085 Process raid pid: 1441707 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1441707 /var/tmp/spdk-raid.sock 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1441707 ']' 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:58.085 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:58.085 19:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:58.085 [2024-07-24 19:54:49.670104] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:18:58.085 [2024-07-24 19:54:49.670173] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:58.345 [2024-07-24 19:54:49.803191] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:58.345 [2024-07-24 19:54:49.904807] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:58.603 [2024-07-24 19:54:49.968741] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:58.603 [2024-07-24 19:54:49.968770] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:59.170 19:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:59.170 19:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:18:59.170 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:59.427 [2024-07-24 19:54:50.780101] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:59.427 [2024-07-24 19:54:50.780142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:59.427 [2024-07-24 19:54:50.780153] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:59.427 [2024-07-24 19:54:50.780165] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:59.427 [2024-07-24 19:54:50.780174] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:59.427 [2024-07-24 19:54:50.780184] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:18:59.427 [2024-07-24 19:54:50.780193] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:59.427 [2024-07-24 19:54:50.780204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:59.427 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:59.427 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.428 19:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.685 19:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.685 "name": "Existed_Raid", 00:18:59.685 "uuid": "c490a8d1-e606-40d1-857e-6dc933a9fe78", 
00:18:59.685 "strip_size_kb": 64, 00:18:59.685 "state": "configuring", 00:18:59.685 "raid_level": "raid0", 00:18:59.685 "superblock": true, 00:18:59.685 "num_base_bdevs": 4, 00:18:59.685 "num_base_bdevs_discovered": 0, 00:18:59.685 "num_base_bdevs_operational": 4, 00:18:59.685 "base_bdevs_list": [ 00:18:59.685 { 00:18:59.685 "name": "BaseBdev1", 00:18:59.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.685 "is_configured": false, 00:18:59.685 "data_offset": 0, 00:18:59.685 "data_size": 0 00:18:59.685 }, 00:18:59.685 { 00:18:59.685 "name": "BaseBdev2", 00:18:59.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.685 "is_configured": false, 00:18:59.685 "data_offset": 0, 00:18:59.685 "data_size": 0 00:18:59.685 }, 00:18:59.685 { 00:18:59.685 "name": "BaseBdev3", 00:18:59.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.685 "is_configured": false, 00:18:59.685 "data_offset": 0, 00:18:59.685 "data_size": 0 00:18:59.685 }, 00:18:59.685 { 00:18:59.685 "name": "BaseBdev4", 00:18:59.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.685 "is_configured": false, 00:18:59.685 "data_offset": 0, 00:18:59.685 "data_size": 0 00:18:59.685 } 00:18:59.685 ] 00:18:59.685 }' 00:18:59.685 19:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.685 19:54:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:00.250 19:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:00.250 [2024-07-24 19:54:51.750510] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:00.250 [2024-07-24 19:54:51.750539] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22efa30 name Existed_Raid, state configuring 00:19:00.250 19:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:00.508 [2024-07-24 19:54:51.943056] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:00.508 [2024-07-24 19:54:51.943082] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:00.508 [2024-07-24 19:54:51.943092] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:00.508 [2024-07-24 19:54:51.943103] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:00.508 [2024-07-24 19:54:51.943112] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:00.508 [2024-07-24 19:54:51.943123] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:00.508 [2024-07-24 19:54:51.943131] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:00.508 [2024-07-24 19:54:51.943142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:00.508 19:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:00.767 [2024-07-24 19:54:52.205803] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:00.767 BaseBdev1 00:19:00.767 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:00.767 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:00.767 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:00.767 19:54:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:00.767 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:00.767 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:00.767 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:01.026 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:01.026 [ 00:19:01.026 { 00:19:01.026 "name": "BaseBdev1", 00:19:01.026 "aliases": [ 00:19:01.026 "bf75dc61-68e0-4bd1-acbf-4930b6f856ab" 00:19:01.026 ], 00:19:01.026 "product_name": "Malloc disk", 00:19:01.026 "block_size": 512, 00:19:01.026 "num_blocks": 65536, 00:19:01.026 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:01.026 "assigned_rate_limits": { 00:19:01.026 "rw_ios_per_sec": 0, 00:19:01.026 "rw_mbytes_per_sec": 0, 00:19:01.026 "r_mbytes_per_sec": 0, 00:19:01.026 "w_mbytes_per_sec": 0 00:19:01.026 }, 00:19:01.026 "claimed": true, 00:19:01.026 "claim_type": "exclusive_write", 00:19:01.026 "zoned": false, 00:19:01.026 "supported_io_types": { 00:19:01.026 "read": true, 00:19:01.026 "write": true, 00:19:01.026 "unmap": true, 00:19:01.026 "flush": true, 00:19:01.026 "reset": true, 00:19:01.026 "nvme_admin": false, 00:19:01.026 "nvme_io": false, 00:19:01.026 "nvme_io_md": false, 00:19:01.026 "write_zeroes": true, 00:19:01.026 "zcopy": true, 00:19:01.026 "get_zone_info": false, 00:19:01.026 "zone_management": false, 00:19:01.026 "zone_append": false, 00:19:01.026 "compare": false, 00:19:01.026 "compare_and_write": false, 00:19:01.026 "abort": true, 00:19:01.026 "seek_hole": false, 00:19:01.026 "seek_data": false, 
00:19:01.026 "copy": true, 00:19:01.026 "nvme_iov_md": false 00:19:01.026 }, 00:19:01.026 "memory_domains": [ 00:19:01.026 { 00:19:01.026 "dma_device_id": "system", 00:19:01.026 "dma_device_type": 1 00:19:01.026 }, 00:19:01.026 { 00:19:01.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.026 "dma_device_type": 2 00:19:01.026 } 00:19:01.026 ], 00:19:01.026 "driver_specific": {} 00:19:01.026 } 00:19:01.026 ] 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.285 "name": "Existed_Raid", 00:19:01.285 "uuid": "cd1a0914-e536-4848-b676-23627fc7fde6", 00:19:01.285 "strip_size_kb": 64, 00:19:01.285 "state": "configuring", 00:19:01.285 "raid_level": "raid0", 00:19:01.285 "superblock": true, 00:19:01.285 "num_base_bdevs": 4, 00:19:01.285 "num_base_bdevs_discovered": 1, 00:19:01.285 "num_base_bdevs_operational": 4, 00:19:01.285 "base_bdevs_list": [ 00:19:01.285 { 00:19:01.285 "name": "BaseBdev1", 00:19:01.285 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:01.285 "is_configured": true, 00:19:01.285 "data_offset": 2048, 00:19:01.285 "data_size": 63488 00:19:01.285 }, 00:19:01.285 { 00:19:01.285 "name": "BaseBdev2", 00:19:01.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.285 "is_configured": false, 00:19:01.285 "data_offset": 0, 00:19:01.285 "data_size": 0 00:19:01.285 }, 00:19:01.285 { 00:19:01.285 "name": "BaseBdev3", 00:19:01.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.285 "is_configured": false, 00:19:01.285 "data_offset": 0, 00:19:01.285 "data_size": 0 00:19:01.285 }, 00:19:01.285 { 00:19:01.285 "name": "BaseBdev4", 00:19:01.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.285 "is_configured": false, 00:19:01.285 "data_offset": 0, 00:19:01.285 "data_size": 0 00:19:01.285 } 00:19:01.285 ] 00:19:01.285 }' 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.285 19:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:01.852 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:02.112 [2024-07-24 19:54:53.525303] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:19:02.112 [2024-07-24 19:54:53.525349] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22ef2a0 name Existed_Raid, state configuring 00:19:02.112 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:02.371 [2024-07-24 19:54:53.705842] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:02.371 [2024-07-24 19:54:53.707265] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:02.371 [2024-07-24 19:54:53.707299] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:02.371 [2024-07-24 19:54:53.707310] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:02.371 [2024-07-24 19:54:53.707321] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:02.371 [2024-07-24 19:54:53.707330] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:02.371 [2024-07-24 19:54:53.707341] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:02.371 19:54:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.371 "name": "Existed_Raid", 00:19:02.371 "uuid": "6ea5d89b-49c8-4651-b890-b969bab142b1", 00:19:02.371 "strip_size_kb": 64, 00:19:02.371 "state": "configuring", 00:19:02.371 "raid_level": "raid0", 00:19:02.371 "superblock": true, 00:19:02.371 "num_base_bdevs": 4, 00:19:02.371 "num_base_bdevs_discovered": 1, 00:19:02.371 "num_base_bdevs_operational": 4, 00:19:02.371 "base_bdevs_list": [ 00:19:02.371 { 00:19:02.371 "name": "BaseBdev1", 00:19:02.371 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:02.371 "is_configured": true, 00:19:02.371 "data_offset": 2048, 00:19:02.371 "data_size": 63488 00:19:02.371 }, 00:19:02.371 { 00:19:02.371 "name": "BaseBdev2", 00:19:02.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.371 "is_configured": false, 
00:19:02.371 "data_offset": 0, 00:19:02.371 "data_size": 0 00:19:02.371 }, 00:19:02.371 { 00:19:02.371 "name": "BaseBdev3", 00:19:02.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.371 "is_configured": false, 00:19:02.371 "data_offset": 0, 00:19:02.371 "data_size": 0 00:19:02.371 }, 00:19:02.371 { 00:19:02.371 "name": "BaseBdev4", 00:19:02.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.371 "is_configured": false, 00:19:02.371 "data_offset": 0, 00:19:02.371 "data_size": 0 00:19:02.371 } 00:19:02.371 ] 00:19:02.371 }' 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.371 19:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:03.307 [2024-07-24 19:54:54.777271] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:03.307 BaseBdev2 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:03.307 19:54:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:19:03.565 19:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:03.823 [ 00:19:03.823 { 00:19:03.823 "name": "BaseBdev2", 00:19:03.823 "aliases": [ 00:19:03.823 "6a519c7f-efb0-4298-81ef-29eeaba7c3b5" 00:19:03.823 ], 00:19:03.823 "product_name": "Malloc disk", 00:19:03.823 "block_size": 512, 00:19:03.823 "num_blocks": 65536, 00:19:03.823 "uuid": "6a519c7f-efb0-4298-81ef-29eeaba7c3b5", 00:19:03.823 "assigned_rate_limits": { 00:19:03.823 "rw_ios_per_sec": 0, 00:19:03.823 "rw_mbytes_per_sec": 0, 00:19:03.823 "r_mbytes_per_sec": 0, 00:19:03.823 "w_mbytes_per_sec": 0 00:19:03.823 }, 00:19:03.823 "claimed": true, 00:19:03.823 "claim_type": "exclusive_write", 00:19:03.823 "zoned": false, 00:19:03.823 "supported_io_types": { 00:19:03.823 "read": true, 00:19:03.823 "write": true, 00:19:03.823 "unmap": true, 00:19:03.823 "flush": true, 00:19:03.823 "reset": true, 00:19:03.823 "nvme_admin": false, 00:19:03.823 "nvme_io": false, 00:19:03.823 "nvme_io_md": false, 00:19:03.823 "write_zeroes": true, 00:19:03.823 "zcopy": true, 00:19:03.823 "get_zone_info": false, 00:19:03.823 "zone_management": false, 00:19:03.823 "zone_append": false, 00:19:03.823 "compare": false, 00:19:03.823 "compare_and_write": false, 00:19:03.823 "abort": true, 00:19:03.823 "seek_hole": false, 00:19:03.823 "seek_data": false, 00:19:03.823 "copy": true, 00:19:03.823 "nvme_iov_md": false 00:19:03.823 }, 00:19:03.823 "memory_domains": [ 00:19:03.823 { 00:19:03.823 "dma_device_id": "system", 00:19:03.823 "dma_device_type": 1 00:19:03.823 }, 00:19:03.823 { 00:19:03.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.823 "dma_device_type": 2 00:19:03.823 } 00:19:03.823 ], 00:19:03.823 "driver_specific": {} 00:19:03.823 } 00:19:03.823 ] 00:19:03.823 19:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 
-- # return 0 00:19:03.823 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:03.823 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:03.823 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:03.823 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.824 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.082 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.082 "name": "Existed_Raid", 00:19:04.082 "uuid": "6ea5d89b-49c8-4651-b890-b969bab142b1", 00:19:04.082 "strip_size_kb": 64, 
00:19:04.082 "state": "configuring", 00:19:04.082 "raid_level": "raid0", 00:19:04.082 "superblock": true, 00:19:04.082 "num_base_bdevs": 4, 00:19:04.082 "num_base_bdevs_discovered": 2, 00:19:04.082 "num_base_bdevs_operational": 4, 00:19:04.082 "base_bdevs_list": [ 00:19:04.082 { 00:19:04.082 "name": "BaseBdev1", 00:19:04.082 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:04.082 "is_configured": true, 00:19:04.082 "data_offset": 2048, 00:19:04.082 "data_size": 63488 00:19:04.082 }, 00:19:04.082 { 00:19:04.082 "name": "BaseBdev2", 00:19:04.082 "uuid": "6a519c7f-efb0-4298-81ef-29eeaba7c3b5", 00:19:04.082 "is_configured": true, 00:19:04.082 "data_offset": 2048, 00:19:04.082 "data_size": 63488 00:19:04.082 }, 00:19:04.082 { 00:19:04.082 "name": "BaseBdev3", 00:19:04.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.082 "is_configured": false, 00:19:04.082 "data_offset": 0, 00:19:04.082 "data_size": 0 00:19:04.082 }, 00:19:04.082 { 00:19:04.082 "name": "BaseBdev4", 00:19:04.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.082 "is_configured": false, 00:19:04.082 "data_offset": 0, 00:19:04.082 "data_size": 0 00:19:04.082 } 00:19:04.082 ] 00:19:04.082 }' 00:19:04.082 19:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.082 19:54:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:04.649 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:04.908 [2024-07-24 19:54:56.308879] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:04.908 BaseBdev3 00:19:04.908 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:04.908 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local 
bdev_name=BaseBdev3 00:19:04.908 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:04.908 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:04.908 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:04.908 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:04.908 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:05.165 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:05.424 [ 00:19:05.424 { 00:19:05.424 "name": "BaseBdev3", 00:19:05.424 "aliases": [ 00:19:05.424 "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b" 00:19:05.424 ], 00:19:05.424 "product_name": "Malloc disk", 00:19:05.424 "block_size": 512, 00:19:05.424 "num_blocks": 65536, 00:19:05.424 "uuid": "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b", 00:19:05.424 "assigned_rate_limits": { 00:19:05.424 "rw_ios_per_sec": 0, 00:19:05.424 "rw_mbytes_per_sec": 0, 00:19:05.424 "r_mbytes_per_sec": 0, 00:19:05.424 "w_mbytes_per_sec": 0 00:19:05.424 }, 00:19:05.424 "claimed": true, 00:19:05.424 "claim_type": "exclusive_write", 00:19:05.424 "zoned": false, 00:19:05.424 "supported_io_types": { 00:19:05.424 "read": true, 00:19:05.424 "write": true, 00:19:05.424 "unmap": true, 00:19:05.424 "flush": true, 00:19:05.424 "reset": true, 00:19:05.424 "nvme_admin": false, 00:19:05.424 "nvme_io": false, 00:19:05.424 "nvme_io_md": false, 00:19:05.424 "write_zeroes": true, 00:19:05.424 "zcopy": true, 00:19:05.424 "get_zone_info": false, 00:19:05.424 "zone_management": false, 00:19:05.424 "zone_append": false, 00:19:05.424 
"compare": false, 00:19:05.424 "compare_and_write": false, 00:19:05.424 "abort": true, 00:19:05.424 "seek_hole": false, 00:19:05.424 "seek_data": false, 00:19:05.424 "copy": true, 00:19:05.424 "nvme_iov_md": false 00:19:05.424 }, 00:19:05.424 "memory_domains": [ 00:19:05.424 { 00:19:05.424 "dma_device_id": "system", 00:19:05.424 "dma_device_type": 1 00:19:05.424 }, 00:19:05.424 { 00:19:05.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.424 "dma_device_type": 2 00:19:05.424 } 00:19:05.424 ], 00:19:05.424 "driver_specific": {} 00:19:05.424 } 00:19:05.424 ] 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.424 19:54:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.424 19:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:05.683 19:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.683 "name": "Existed_Raid", 00:19:05.683 "uuid": "6ea5d89b-49c8-4651-b890-b969bab142b1", 00:19:05.683 "strip_size_kb": 64, 00:19:05.683 "state": "configuring", 00:19:05.683 "raid_level": "raid0", 00:19:05.683 "superblock": true, 00:19:05.683 "num_base_bdevs": 4, 00:19:05.683 "num_base_bdevs_discovered": 3, 00:19:05.683 "num_base_bdevs_operational": 4, 00:19:05.683 "base_bdevs_list": [ 00:19:05.683 { 00:19:05.683 "name": "BaseBdev1", 00:19:05.683 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:05.683 "is_configured": true, 00:19:05.684 "data_offset": 2048, 00:19:05.684 "data_size": 63488 00:19:05.684 }, 00:19:05.684 { 00:19:05.684 "name": "BaseBdev2", 00:19:05.684 "uuid": "6a519c7f-efb0-4298-81ef-29eeaba7c3b5", 00:19:05.684 "is_configured": true, 00:19:05.684 "data_offset": 2048, 00:19:05.684 "data_size": 63488 00:19:05.684 }, 00:19:05.684 { 00:19:05.684 "name": "BaseBdev3", 00:19:05.684 "uuid": "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b", 00:19:05.684 "is_configured": true, 00:19:05.684 "data_offset": 2048, 00:19:05.684 "data_size": 63488 00:19:05.684 }, 00:19:05.684 { 00:19:05.684 "name": "BaseBdev4", 00:19:05.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.684 "is_configured": false, 00:19:05.684 "data_offset": 0, 00:19:05.684 "data_size": 0 00:19:05.684 } 00:19:05.684 ] 00:19:05.684 }' 00:19:05.684 19:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.684 19:54:57 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:06.250 19:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:06.509 [2024-07-24 19:54:57.956681] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:06.509 [2024-07-24 19:54:57.956869] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22f0300 00:19:06.509 [2024-07-24 19:54:57.956883] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:06.509 [2024-07-24 19:54:57.957062] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22f1280 00:19:06.509 [2024-07-24 19:54:57.957193] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22f0300 00:19:06.509 [2024-07-24 19:54:57.957203] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22f0300 00:19:06.509 [2024-07-24 19:54:57.957295] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:06.509 BaseBdev4 00:19:06.509 19:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:06.509 19:54:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:06.509 19:54:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:06.509 19:54:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:06.509 19:54:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:06.509 19:54:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:06.509 19:54:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:06.767 19:54:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:07.027 [ 00:19:07.027 { 00:19:07.027 "name": "BaseBdev4", 00:19:07.027 "aliases": [ 00:19:07.027 "3a1363ef-3556-406f-9f3a-612688a204de" 00:19:07.027 ], 00:19:07.027 "product_name": "Malloc disk", 00:19:07.027 "block_size": 512, 00:19:07.027 "num_blocks": 65536, 00:19:07.027 "uuid": "3a1363ef-3556-406f-9f3a-612688a204de", 00:19:07.027 "assigned_rate_limits": { 00:19:07.027 "rw_ios_per_sec": 0, 00:19:07.027 "rw_mbytes_per_sec": 0, 00:19:07.027 "r_mbytes_per_sec": 0, 00:19:07.027 "w_mbytes_per_sec": 0 00:19:07.027 }, 00:19:07.027 "claimed": true, 00:19:07.027 "claim_type": "exclusive_write", 00:19:07.027 "zoned": false, 00:19:07.027 "supported_io_types": { 00:19:07.027 "read": true, 00:19:07.027 "write": true, 00:19:07.027 "unmap": true, 00:19:07.027 "flush": true, 00:19:07.027 "reset": true, 00:19:07.027 "nvme_admin": false, 00:19:07.027 "nvme_io": false, 00:19:07.027 "nvme_io_md": false, 00:19:07.027 "write_zeroes": true, 00:19:07.027 "zcopy": true, 00:19:07.027 "get_zone_info": false, 00:19:07.027 "zone_management": false, 00:19:07.027 "zone_append": false, 00:19:07.027 "compare": false, 00:19:07.027 "compare_and_write": false, 00:19:07.027 "abort": true, 00:19:07.027 "seek_hole": false, 00:19:07.027 "seek_data": false, 00:19:07.027 "copy": true, 00:19:07.027 "nvme_iov_md": false 00:19:07.027 }, 00:19:07.027 "memory_domains": [ 00:19:07.027 { 00:19:07.027 "dma_device_id": "system", 00:19:07.027 "dma_device_type": 1 00:19:07.027 }, 00:19:07.027 { 00:19:07.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.027 "dma_device_type": 2 00:19:07.027 } 00:19:07.027 ], 00:19:07.027 "driver_specific": {} 00:19:07.027 } 00:19:07.027 ] 
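The check repeated throughout this trace (`verify_raid_bdev_state`) boils down to: fetch the raid bdev info over the RPC socket, select the entry named `Existed_Raid`, and compare its `state` and `num_base_bdevs_discovered` fields against the expected values. A minimal standalone sketch of that extraction step, with the live RPC call replaced by a sample of the JSON seen earlier in this log (the real script pipes `scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all` through `jq -r '.[] | select(.name == "Existed_Raid")'`):

```shell
#!/usr/bin/env bash
# Sketch of the verify_raid_bdev_state check from this trace.
# The RPC output is stubbed with a sample of the JSON this log shows;
# in the real test it comes from rpc.py over /var/tmp/spdk-raid.sock.
raid_bdev_info='{
  "name": "Existed_Raid",
  "state": "configuring",
  "num_base_bdevs": 4,
  "num_base_bdevs_discovered": 1
}'
# Pull out the two fields the test compares (sed instead of jq so the
# sketch has no external dependency).
state=$(echo "$raid_bdev_info" | sed -n 's/.*"state": "\([a-z]*\)".*/\1/p')
discovered=$(echo "$raid_bdev_info" | sed -n 's/.*"num_base_bdevs_discovered": \([0-9]*\).*/\1/p')
echo "state=$state discovered=$discovered"
```

As each `bdev_malloc_create` in the trace adds another base bdev, `num_base_bdevs_discovered` climbs from 1 toward 4, and once it reaches `num_base_bdevs` the raid transitions from `configuring` to `online`, which is exactly the state change the later `verify_raid_bdev_state Existed_Raid online ...` call asserts.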
00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.027 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.028 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.028 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.286 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.286 "name": "Existed_Raid", 00:19:07.286 
"uuid": "6ea5d89b-49c8-4651-b890-b969bab142b1", 00:19:07.286 "strip_size_kb": 64, 00:19:07.286 "state": "online", 00:19:07.286 "raid_level": "raid0", 00:19:07.286 "superblock": true, 00:19:07.286 "num_base_bdevs": 4, 00:19:07.286 "num_base_bdevs_discovered": 4, 00:19:07.286 "num_base_bdevs_operational": 4, 00:19:07.286 "base_bdevs_list": [ 00:19:07.286 { 00:19:07.286 "name": "BaseBdev1", 00:19:07.286 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:07.286 "is_configured": true, 00:19:07.286 "data_offset": 2048, 00:19:07.286 "data_size": 63488 00:19:07.286 }, 00:19:07.286 { 00:19:07.286 "name": "BaseBdev2", 00:19:07.286 "uuid": "6a519c7f-efb0-4298-81ef-29eeaba7c3b5", 00:19:07.286 "is_configured": true, 00:19:07.286 "data_offset": 2048, 00:19:07.286 "data_size": 63488 00:19:07.286 }, 00:19:07.286 { 00:19:07.286 "name": "BaseBdev3", 00:19:07.286 "uuid": "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b", 00:19:07.286 "is_configured": true, 00:19:07.286 "data_offset": 2048, 00:19:07.286 "data_size": 63488 00:19:07.286 }, 00:19:07.286 { 00:19:07.286 "name": "BaseBdev4", 00:19:07.286 "uuid": "3a1363ef-3556-406f-9f3a-612688a204de", 00:19:07.286 "is_configured": true, 00:19:07.286 "data_offset": 2048, 00:19:07.286 "data_size": 63488 00:19:07.286 } 00:19:07.286 ] 00:19:07.286 }' 00:19:07.286 19:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.286 19:54:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:07.853 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:07.853 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:07.853 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:07.853 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:07.853 19:54:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:07.853 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:07.853 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:07.853 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:08.111 [2024-07-24 19:54:59.613415] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:08.111 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:08.111 "name": "Existed_Raid", 00:19:08.111 "aliases": [ 00:19:08.111 "6ea5d89b-49c8-4651-b890-b969bab142b1" 00:19:08.111 ], 00:19:08.111 "product_name": "Raid Volume", 00:19:08.111 "block_size": 512, 00:19:08.111 "num_blocks": 253952, 00:19:08.111 "uuid": "6ea5d89b-49c8-4651-b890-b969bab142b1", 00:19:08.111 "assigned_rate_limits": { 00:19:08.111 "rw_ios_per_sec": 0, 00:19:08.111 "rw_mbytes_per_sec": 0, 00:19:08.111 "r_mbytes_per_sec": 0, 00:19:08.111 "w_mbytes_per_sec": 0 00:19:08.111 }, 00:19:08.111 "claimed": false, 00:19:08.111 "zoned": false, 00:19:08.111 "supported_io_types": { 00:19:08.111 "read": true, 00:19:08.111 "write": true, 00:19:08.111 "unmap": true, 00:19:08.111 "flush": true, 00:19:08.111 "reset": true, 00:19:08.111 "nvme_admin": false, 00:19:08.111 "nvme_io": false, 00:19:08.111 "nvme_io_md": false, 00:19:08.111 "write_zeroes": true, 00:19:08.111 "zcopy": false, 00:19:08.111 "get_zone_info": false, 00:19:08.111 "zone_management": false, 00:19:08.111 "zone_append": false, 00:19:08.111 "compare": false, 00:19:08.111 "compare_and_write": false, 00:19:08.111 "abort": false, 00:19:08.111 "seek_hole": false, 00:19:08.111 "seek_data": false, 00:19:08.111 "copy": false, 00:19:08.111 "nvme_iov_md": false 00:19:08.111 }, 00:19:08.111 
"memory_domains": [ 00:19:08.111 { 00:19:08.111 "dma_device_id": "system", 00:19:08.111 "dma_device_type": 1 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.111 "dma_device_type": 2 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "dma_device_id": "system", 00:19:08.111 "dma_device_type": 1 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.111 "dma_device_type": 2 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "dma_device_id": "system", 00:19:08.111 "dma_device_type": 1 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.111 "dma_device_type": 2 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "dma_device_id": "system", 00:19:08.111 "dma_device_type": 1 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.111 "dma_device_type": 2 00:19:08.111 } 00:19:08.111 ], 00:19:08.111 "driver_specific": { 00:19:08.111 "raid": { 00:19:08.111 "uuid": "6ea5d89b-49c8-4651-b890-b969bab142b1", 00:19:08.111 "strip_size_kb": 64, 00:19:08.111 "state": "online", 00:19:08.111 "raid_level": "raid0", 00:19:08.111 "superblock": true, 00:19:08.111 "num_base_bdevs": 4, 00:19:08.111 "num_base_bdevs_discovered": 4, 00:19:08.111 "num_base_bdevs_operational": 4, 00:19:08.111 "base_bdevs_list": [ 00:19:08.111 { 00:19:08.111 "name": "BaseBdev1", 00:19:08.111 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:08.111 "is_configured": true, 00:19:08.111 "data_offset": 2048, 00:19:08.111 "data_size": 63488 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "name": "BaseBdev2", 00:19:08.111 "uuid": "6a519c7f-efb0-4298-81ef-29eeaba7c3b5", 00:19:08.111 "is_configured": true, 00:19:08.111 "data_offset": 2048, 00:19:08.111 "data_size": 63488 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "name": "BaseBdev3", 00:19:08.111 "uuid": "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b", 00:19:08.111 "is_configured": true, 00:19:08.111 "data_offset": 2048, 00:19:08.111 
"data_size": 63488 00:19:08.111 }, 00:19:08.111 { 00:19:08.111 "name": "BaseBdev4", 00:19:08.111 "uuid": "3a1363ef-3556-406f-9f3a-612688a204de", 00:19:08.111 "is_configured": true, 00:19:08.111 "data_offset": 2048, 00:19:08.111 "data_size": 63488 00:19:08.111 } 00:19:08.111 ] 00:19:08.111 } 00:19:08.111 } 00:19:08.111 }' 00:19:08.111 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:08.111 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:08.111 BaseBdev2 00:19:08.111 BaseBdev3 00:19:08.111 BaseBdev4' 00:19:08.111 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:08.112 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:08.112 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:08.370 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:08.370 "name": "BaseBdev1", 00:19:08.370 "aliases": [ 00:19:08.370 "bf75dc61-68e0-4bd1-acbf-4930b6f856ab" 00:19:08.370 ], 00:19:08.370 "product_name": "Malloc disk", 00:19:08.370 "block_size": 512, 00:19:08.370 "num_blocks": 65536, 00:19:08.370 "uuid": "bf75dc61-68e0-4bd1-acbf-4930b6f856ab", 00:19:08.370 "assigned_rate_limits": { 00:19:08.370 "rw_ios_per_sec": 0, 00:19:08.370 "rw_mbytes_per_sec": 0, 00:19:08.370 "r_mbytes_per_sec": 0, 00:19:08.370 "w_mbytes_per_sec": 0 00:19:08.370 }, 00:19:08.370 "claimed": true, 00:19:08.370 "claim_type": "exclusive_write", 00:19:08.370 "zoned": false, 00:19:08.370 "supported_io_types": { 00:19:08.370 "read": true, 00:19:08.370 "write": true, 00:19:08.370 "unmap": true, 00:19:08.370 "flush": true, 00:19:08.370 "reset": true, 
00:19:08.370 "nvme_admin": false, 00:19:08.370 "nvme_io": false, 00:19:08.370 "nvme_io_md": false, 00:19:08.370 "write_zeroes": true, 00:19:08.370 "zcopy": true, 00:19:08.370 "get_zone_info": false, 00:19:08.370 "zone_management": false, 00:19:08.370 "zone_append": false, 00:19:08.370 "compare": false, 00:19:08.370 "compare_and_write": false, 00:19:08.370 "abort": true, 00:19:08.370 "seek_hole": false, 00:19:08.370 "seek_data": false, 00:19:08.370 "copy": true, 00:19:08.370 "nvme_iov_md": false 00:19:08.370 }, 00:19:08.370 "memory_domains": [ 00:19:08.370 { 00:19:08.370 "dma_device_id": "system", 00:19:08.370 "dma_device_type": 1 00:19:08.370 }, 00:19:08.370 { 00:19:08.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.370 "dma_device_type": 2 00:19:08.370 } 00:19:08.370 ], 00:19:08.370 "driver_specific": {} 00:19:08.370 }' 00:19:08.370 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.628 19:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:08.628 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.887 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:08.887 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:08.887 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:08.887 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:08.887 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:09.145 "name": "BaseBdev2", 00:19:09.145 "aliases": [ 00:19:09.145 "6a519c7f-efb0-4298-81ef-29eeaba7c3b5" 00:19:09.145 ], 00:19:09.145 "product_name": "Malloc disk", 00:19:09.145 "block_size": 512, 00:19:09.145 "num_blocks": 65536, 00:19:09.145 "uuid": "6a519c7f-efb0-4298-81ef-29eeaba7c3b5", 00:19:09.145 "assigned_rate_limits": { 00:19:09.145 "rw_ios_per_sec": 0, 00:19:09.145 "rw_mbytes_per_sec": 0, 00:19:09.145 "r_mbytes_per_sec": 0, 00:19:09.145 "w_mbytes_per_sec": 0 00:19:09.145 }, 00:19:09.145 "claimed": true, 00:19:09.145 "claim_type": "exclusive_write", 00:19:09.145 "zoned": false, 00:19:09.145 "supported_io_types": { 00:19:09.145 "read": true, 00:19:09.145 "write": true, 00:19:09.145 "unmap": true, 00:19:09.145 "flush": true, 00:19:09.145 "reset": true, 00:19:09.145 "nvme_admin": false, 00:19:09.145 "nvme_io": false, 00:19:09.145 "nvme_io_md": false, 00:19:09.145 "write_zeroes": true, 00:19:09.145 "zcopy": true, 00:19:09.145 "get_zone_info": false, 00:19:09.145 "zone_management": false, 00:19:09.145 "zone_append": false, 00:19:09.145 "compare": false, 00:19:09.145 "compare_and_write": false, 00:19:09.145 "abort": true, 00:19:09.145 "seek_hole": false, 00:19:09.145 "seek_data": false, 00:19:09.145 "copy": true, 00:19:09.145 "nvme_iov_md": false 00:19:09.145 }, 00:19:09.145 "memory_domains": [ 00:19:09.145 { 
00:19:09.145 "dma_device_id": "system", 00:19:09.145 "dma_device_type": 1 00:19:09.145 }, 00:19:09.145 { 00:19:09.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.145 "dma_device_type": 2 00:19:09.145 } 00:19:09.145 ], 00:19:09.145 "driver_specific": {} 00:19:09.145 }' 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:09.145 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:09.404 19:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:09.662 19:55:01 
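The trace repeatedly applies the filter `jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'` (bdev_raid.sh@201) to pick out configured base bdev names. As a standalone sketch, the same selection in Python, using an input dict shaped like the `bdev_get_bdevs` output above (values are illustrative, not copied from the live target):

```python
# Shaped like the raid bdev's driver_specific section in the trace above;
# the names and is_configured flags here are illustrative sample data.
raid_bdev = {
    "driver_specific": {
        "raid": {
            "base_bdevs_list": [
                {"name": "BaseBdev1", "is_configured": True},
                {"name": "BaseBdev2", "is_configured": True},
                {"name": "BaseBdev3", "is_configured": False},
            ],
        }
    }
}

# Equivalent of:
#   jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
base_bdev_names = [
    b["name"]
    for b in raid_bdev["driver_specific"]["raid"]["base_bdevs_list"]
    if b["is_configured"]
]
print(base_bdev_names)  # ['BaseBdev1', 'BaseBdev2']
```

In the actual test the resulting names drive the `for name in $base_bdev_names` loop that fetches and checks each base bdev in turn.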
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:09.662 "name": "BaseBdev3", 00:19:09.662 "aliases": [ 00:19:09.662 "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b" 00:19:09.662 ], 00:19:09.662 "product_name": "Malloc disk", 00:19:09.662 "block_size": 512, 00:19:09.662 "num_blocks": 65536, 00:19:09.662 "uuid": "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b", 00:19:09.662 "assigned_rate_limits": { 00:19:09.662 "rw_ios_per_sec": 0, 00:19:09.662 "rw_mbytes_per_sec": 0, 00:19:09.662 "r_mbytes_per_sec": 0, 00:19:09.662 "w_mbytes_per_sec": 0 00:19:09.662 }, 00:19:09.662 "claimed": true, 00:19:09.662 "claim_type": "exclusive_write", 00:19:09.662 "zoned": false, 00:19:09.662 "supported_io_types": { 00:19:09.662 "read": true, 00:19:09.662 "write": true, 00:19:09.662 "unmap": true, 00:19:09.662 "flush": true, 00:19:09.662 "reset": true, 00:19:09.662 "nvme_admin": false, 00:19:09.662 "nvme_io": false, 00:19:09.662 "nvme_io_md": false, 00:19:09.662 "write_zeroes": true, 00:19:09.662 "zcopy": true, 00:19:09.662 "get_zone_info": false, 00:19:09.662 "zone_management": false, 00:19:09.662 "zone_append": false, 00:19:09.662 "compare": false, 00:19:09.662 "compare_and_write": false, 00:19:09.662 "abort": true, 00:19:09.662 "seek_hole": false, 00:19:09.662 "seek_data": false, 00:19:09.662 "copy": true, 00:19:09.662 "nvme_iov_md": false 00:19:09.662 }, 00:19:09.662 "memory_domains": [ 00:19:09.662 { 00:19:09.662 "dma_device_id": "system", 00:19:09.662 "dma_device_type": 1 00:19:09.662 }, 00:19:09.662 { 00:19:09.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.662 "dma_device_type": 2 00:19:09.662 } 00:19:09.662 ], 00:19:09.662 "driver_specific": {} 00:19:09.662 }' 00:19:09.662 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.662 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:09.662 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:19:09.662 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.920 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:09.920 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:09.920 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.920 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:09.920 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:09.920 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.177 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.177 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.177 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.177 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:10.178 19:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:10.743 "name": "BaseBdev4", 00:19:10.743 "aliases": [ 00:19:10.743 "3a1363ef-3556-406f-9f3a-612688a204de" 00:19:10.743 ], 00:19:10.743 "product_name": "Malloc disk", 00:19:10.743 "block_size": 512, 00:19:10.743 "num_blocks": 65536, 00:19:10.743 "uuid": "3a1363ef-3556-406f-9f3a-612688a204de", 00:19:10.743 "assigned_rate_limits": { 00:19:10.743 "rw_ios_per_sec": 0, 00:19:10.743 "rw_mbytes_per_sec": 0, 00:19:10.743 "r_mbytes_per_sec": 0, 00:19:10.743 "w_mbytes_per_sec": 0 
00:19:10.743 }, 00:19:10.743 "claimed": true, 00:19:10.743 "claim_type": "exclusive_write", 00:19:10.743 "zoned": false, 00:19:10.743 "supported_io_types": { 00:19:10.743 "read": true, 00:19:10.743 "write": true, 00:19:10.743 "unmap": true, 00:19:10.743 "flush": true, 00:19:10.743 "reset": true, 00:19:10.743 "nvme_admin": false, 00:19:10.743 "nvme_io": false, 00:19:10.743 "nvme_io_md": false, 00:19:10.743 "write_zeroes": true, 00:19:10.743 "zcopy": true, 00:19:10.743 "get_zone_info": false, 00:19:10.743 "zone_management": false, 00:19:10.743 "zone_append": false, 00:19:10.743 "compare": false, 00:19:10.743 "compare_and_write": false, 00:19:10.743 "abort": true, 00:19:10.743 "seek_hole": false, 00:19:10.743 "seek_data": false, 00:19:10.743 "copy": true, 00:19:10.743 "nvme_iov_md": false 00:19:10.743 }, 00:19:10.743 "memory_domains": [ 00:19:10.743 { 00:19:10.743 "dma_device_id": "system", 00:19:10.743 "dma_device_type": 1 00:19:10.743 }, 00:19:10.743 { 00:19:10.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.743 "dma_device_type": 2 00:19:10.743 } 00:19:10.743 ], 00:19:10.743 "driver_specific": {} 00:19:10.743 }' 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.743 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.001 
19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:11.001 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.001 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.001 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.001 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:11.259 [2024-07-24 19:55:02.685321] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:11.259 [2024-07-24 19:55:02.685353] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:11.259 [2024-07-24 19:55:02.685408] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:11.259 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.517 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.517 "name": "Existed_Raid", 00:19:11.517 "uuid": "6ea5d89b-49c8-4651-b890-b969bab142b1", 00:19:11.517 "strip_size_kb": 64, 00:19:11.517 "state": "offline", 00:19:11.517 "raid_level": "raid0", 00:19:11.517 "superblock": true, 00:19:11.517 "num_base_bdevs": 4, 00:19:11.517 "num_base_bdevs_discovered": 3, 00:19:11.517 "num_base_bdevs_operational": 3, 00:19:11.517 "base_bdevs_list": [ 00:19:11.517 { 00:19:11.517 "name": null, 00:19:11.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.517 "is_configured": false, 00:19:11.517 "data_offset": 2048, 00:19:11.517 "data_size": 63488 00:19:11.517 }, 00:19:11.517 { 00:19:11.517 "name": "BaseBdev2", 00:19:11.517 "uuid": "6a519c7f-efb0-4298-81ef-29eeaba7c3b5", 00:19:11.517 "is_configured": true, 00:19:11.517 "data_offset": 2048, 00:19:11.517 "data_size": 63488 00:19:11.517 }, 00:19:11.517 
{ 00:19:11.517 "name": "BaseBdev3", 00:19:11.517 "uuid": "1eebe4c0-2964-4fc6-8f97-e527b6d90f1b", 00:19:11.517 "is_configured": true, 00:19:11.517 "data_offset": 2048, 00:19:11.517 "data_size": 63488 00:19:11.517 }, 00:19:11.517 { 00:19:11.517 "name": "BaseBdev4", 00:19:11.517 "uuid": "3a1363ef-3556-406f-9f3a-612688a204de", 00:19:11.517 "is_configured": true, 00:19:11.517 "data_offset": 2048, 00:19:11.517 "data_size": 63488 00:19:11.517 } 00:19:11.517 ] 00:19:11.517 }' 00:19:11.517 19:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.517 19:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:12.082 19:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:12.082 19:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:12.082 19:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.082 19:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:12.339 19:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:12.339 19:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:12.339 19:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:12.597 [2024-07-24 19:55:03.981934] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:12.597 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:12.597 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:12.597 
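The `verify_raid_bdev_state` helper traced above (bdev_raid.sh@116-128) compares fields of the `bdev_raid_get_bdevs` JSON against expected values. A minimal sketch of that check's logic in Python — not the actual SPDK test helper — applied to a dict shaped like the offline `Existed_Raid` info in the log (illustrative values):

```python
def verify_raid_state(info, expected_state, raid_level, strip_size_kb, operational):
    """Sketch of the verify_raid_bdev_state checks: compare the raid bdev's
    reported fields against the expected values and cross-check the
    discovered count against the configured base bdevs."""
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size_kb
    assert info["num_base_bdevs_operational"] == operational
    discovered = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert discovered == info["num_base_bdevs_discovered"]
    return True

# Shaped like the Existed_Raid info above after BaseBdev1 was removed:
# state offline, 3 of 4 base bdevs still configured (illustrative values).
existed_raid = {
    "name": "Existed_Raid",
    "state": "offline",
    "raid_level": "raid0",
    "strip_size_kb": 64,
    "num_base_bdevs": 4,
    "num_base_bdevs_discovered": 3,
    "num_base_bdevs_operational": 3,
    "base_bdevs_list": [
        {"name": None, "is_configured": False},
        {"name": "BaseBdev2", "is_configured": True},
        {"name": "BaseBdev3", "is_configured": True},
        {"name": "BaseBdev4", "is_configured": True},
    ],
}

ok = verify_raid_state(existed_raid, "offline", "raid0", 64, 3)
print(ok)  # True
```

This mirrors why the test expects `offline` here: raid0 has no redundancy (`has_redundancy raid0` returns 1 in the trace), so deleting one base bdev takes the whole array offline.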
19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.597 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:12.855 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:12.855 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:12.855 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:13.113 [2024-07-24 19:55:04.448594] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:13.113 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:13.113 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:13.113 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.113 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:13.373 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:13.373 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:13.373 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:13.373 [2024-07-24 19:55:04.956975] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:13.373 [2024-07-24 19:55:04.957059] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22f0300 name Existed_Raid, state offline 00:19:13.669 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:13.669 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:13.669 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.669 19:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:13.669 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:13.669 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:13.669 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:13.669 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:13.669 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:13.669 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:13.928 BaseBdev2 00:19:13.928 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:13.928 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:13.928 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:13.928 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:13.928 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 
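The `waitforbdev` helper traced here (autotest_common.sh@899-907) polls `bdev_get_bdevs -b <name> -t <timeout>` with a 2000 ms default until the newly created bdev is visible. A standalone sketch of that poll-until-ready pattern — the `fetch` callable and fake target below stand in for the real `rpc.py` call:

```python
import time

def wait_for_bdev(fetch, name, timeout_ms=2000, interval_s=0.05):
    """Poll fetch(name) until it returns bdev info or the timeout expires.

    fetch is a stand-in for `rpc.py bdev_get_bdevs -b <name>`; returns the
    bdev info dict on success, raises TimeoutError otherwise.
    """
    deadline = time.monotonic() + timeout_ms / 1000.0
    while time.monotonic() < deadline:
        bdev = fetch(name)
        if bdev is not None:
            return bdev
        time.sleep(interval_s)
    raise TimeoutError(f"bdev {name} did not appear within {timeout_ms} ms")

# Simulated target: the bdev only becomes visible on the third poll.
calls = {"n": 0}
def fake_fetch(name):
    calls["n"] += 1
    return {"name": name} if calls["n"] >= 3 else None

info = wait_for_bdev(fake_fetch, "BaseBdev2")
print(info["name"])  # BaseBdev2
```

The real helper additionally runs `bdev_wait_for_examine` first, as the trace shows, so examine callbacks finish before the bdev is queried.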
00:19:13.928 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:13.928 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:14.187 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:14.446 [ 00:19:14.446 { 00:19:14.446 "name": "BaseBdev2", 00:19:14.446 "aliases": [ 00:19:14.446 "cf4cb410-e9a8-4a84-a021-7e522fb00e9a" 00:19:14.446 ], 00:19:14.446 "product_name": "Malloc disk", 00:19:14.446 "block_size": 512, 00:19:14.446 "num_blocks": 65536, 00:19:14.446 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:14.446 "assigned_rate_limits": { 00:19:14.446 "rw_ios_per_sec": 0, 00:19:14.446 "rw_mbytes_per_sec": 0, 00:19:14.446 "r_mbytes_per_sec": 0, 00:19:14.446 "w_mbytes_per_sec": 0 00:19:14.446 }, 00:19:14.446 "claimed": false, 00:19:14.446 "zoned": false, 00:19:14.446 "supported_io_types": { 00:19:14.446 "read": true, 00:19:14.446 "write": true, 00:19:14.446 "unmap": true, 00:19:14.446 "flush": true, 00:19:14.446 "reset": true, 00:19:14.446 "nvme_admin": false, 00:19:14.446 "nvme_io": false, 00:19:14.446 "nvme_io_md": false, 00:19:14.446 "write_zeroes": true, 00:19:14.446 "zcopy": true, 00:19:14.446 "get_zone_info": false, 00:19:14.446 "zone_management": false, 00:19:14.446 "zone_append": false, 00:19:14.446 "compare": false, 00:19:14.446 "compare_and_write": false, 00:19:14.446 "abort": true, 00:19:14.446 "seek_hole": false, 00:19:14.446 "seek_data": false, 00:19:14.446 "copy": true, 00:19:14.446 "nvme_iov_md": false 00:19:14.446 }, 00:19:14.446 "memory_domains": [ 00:19:14.446 { 00:19:14.446 "dma_device_id": "system", 00:19:14.446 "dma_device_type": 1 00:19:14.446 }, 00:19:14.446 { 00:19:14.446 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:14.446 "dma_device_type": 2 00:19:14.446 } 00:19:14.446 ], 00:19:14.446 "driver_specific": {} 00:19:14.446 } 00:19:14.446 ] 00:19:14.446 19:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:14.446 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:14.446 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:14.446 19:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:14.706 BaseBdev3 00:19:14.706 19:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:14.706 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:14.706 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:14.706 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:14.706 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:14.706 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:14.706 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:14.965 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:15.237 [ 00:19:15.237 { 00:19:15.237 "name": "BaseBdev3", 00:19:15.237 "aliases": [ 00:19:15.237 "934290c4-18e8-438e-9f3f-125a220c70d0" 
00:19:15.237 ], 00:19:15.237 "product_name": "Malloc disk", 00:19:15.237 "block_size": 512, 00:19:15.237 "num_blocks": 65536, 00:19:15.237 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:15.237 "assigned_rate_limits": { 00:19:15.237 "rw_ios_per_sec": 0, 00:19:15.237 "rw_mbytes_per_sec": 0, 00:19:15.237 "r_mbytes_per_sec": 0, 00:19:15.237 "w_mbytes_per_sec": 0 00:19:15.237 }, 00:19:15.237 "claimed": false, 00:19:15.237 "zoned": false, 00:19:15.237 "supported_io_types": { 00:19:15.237 "read": true, 00:19:15.237 "write": true, 00:19:15.237 "unmap": true, 00:19:15.237 "flush": true, 00:19:15.237 "reset": true, 00:19:15.237 "nvme_admin": false, 00:19:15.237 "nvme_io": false, 00:19:15.237 "nvme_io_md": false, 00:19:15.237 "write_zeroes": true, 00:19:15.237 "zcopy": true, 00:19:15.237 "get_zone_info": false, 00:19:15.237 "zone_management": false, 00:19:15.237 "zone_append": false, 00:19:15.237 "compare": false, 00:19:15.237 "compare_and_write": false, 00:19:15.237 "abort": true, 00:19:15.237 "seek_hole": false, 00:19:15.237 "seek_data": false, 00:19:15.237 "copy": true, 00:19:15.237 "nvme_iov_md": false 00:19:15.237 }, 00:19:15.237 "memory_domains": [ 00:19:15.237 { 00:19:15.237 "dma_device_id": "system", 00:19:15.237 "dma_device_type": 1 00:19:15.237 }, 00:19:15.237 { 00:19:15.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.237 "dma_device_type": 2 00:19:15.237 } 00:19:15.237 ], 00:19:15.237 "driver_specific": {} 00:19:15.237 } 00:19:15.237 ] 00:19:15.237 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:15.237 19:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:15.237 19:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:15.237 19:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:19:15.497 BaseBdev4 00:19:15.497 19:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:15.498 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:15.498 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:15.498 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:15.498 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:15.498 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:15.498 19:55:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.757 19:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:16.016 [ 00:19:16.016 { 00:19:16.016 "name": "BaseBdev4", 00:19:16.016 "aliases": [ 00:19:16.016 "dd898240-692e-4d59-98d3-943178a64621" 00:19:16.016 ], 00:19:16.016 "product_name": "Malloc disk", 00:19:16.016 "block_size": 512, 00:19:16.016 "num_blocks": 65536, 00:19:16.016 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:16.016 "assigned_rate_limits": { 00:19:16.016 "rw_ios_per_sec": 0, 00:19:16.016 "rw_mbytes_per_sec": 0, 00:19:16.016 "r_mbytes_per_sec": 0, 00:19:16.016 "w_mbytes_per_sec": 0 00:19:16.016 }, 00:19:16.016 "claimed": false, 00:19:16.016 "zoned": false, 00:19:16.016 "supported_io_types": { 00:19:16.016 "read": true, 00:19:16.016 "write": true, 00:19:16.016 "unmap": true, 00:19:16.016 "flush": true, 00:19:16.016 "reset": true, 00:19:16.016 "nvme_admin": false, 00:19:16.016 "nvme_io": false, 00:19:16.016 
"nvme_io_md": false, 00:19:16.016 "write_zeroes": true, 00:19:16.016 "zcopy": true, 00:19:16.016 "get_zone_info": false, 00:19:16.016 "zone_management": false, 00:19:16.016 "zone_append": false, 00:19:16.016 "compare": false, 00:19:16.016 "compare_and_write": false, 00:19:16.016 "abort": true, 00:19:16.016 "seek_hole": false, 00:19:16.016 "seek_data": false, 00:19:16.016 "copy": true, 00:19:16.016 "nvme_iov_md": false 00:19:16.016 }, 00:19:16.016 "memory_domains": [ 00:19:16.016 { 00:19:16.016 "dma_device_id": "system", 00:19:16.016 "dma_device_type": 1 00:19:16.016 }, 00:19:16.016 { 00:19:16.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.016 "dma_device_type": 2 00:19:16.016 } 00:19:16.016 ], 00:19:16.016 "driver_specific": {} 00:19:16.016 } 00:19:16.016 ] 00:19:16.016 19:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:16.016 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:16.016 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:16.016 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:16.275 [2024-07-24 19:55:07.686866] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:16.275 [2024-07-24 19:55:07.686937] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:16.275 [2024-07-24 19:55:07.686978] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:16.275 [2024-07-24 19:55:07.689656] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:16.275 [2024-07-24 19:55:07.689771] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.275 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:16.534 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.534 "name": "Existed_Raid", 00:19:16.534 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:16.534 "strip_size_kb": 64, 00:19:16.534 "state": "configuring", 00:19:16.534 "raid_level": "raid0", 00:19:16.534 "superblock": true, 00:19:16.534 "num_base_bdevs": 4, 00:19:16.534 "num_base_bdevs_discovered": 3, 00:19:16.534 
"num_base_bdevs_operational": 4, 00:19:16.534 "base_bdevs_list": [ 00:19:16.534 { 00:19:16.534 "name": "BaseBdev1", 00:19:16.534 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.534 "is_configured": false, 00:19:16.534 "data_offset": 0, 00:19:16.534 "data_size": 0 00:19:16.534 }, 00:19:16.534 { 00:19:16.534 "name": "BaseBdev2", 00:19:16.534 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:16.534 "is_configured": true, 00:19:16.534 "data_offset": 2048, 00:19:16.534 "data_size": 63488 00:19:16.534 }, 00:19:16.534 { 00:19:16.534 "name": "BaseBdev3", 00:19:16.534 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:16.534 "is_configured": true, 00:19:16.534 "data_offset": 2048, 00:19:16.534 "data_size": 63488 00:19:16.534 }, 00:19:16.534 { 00:19:16.534 "name": "BaseBdev4", 00:19:16.534 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:16.534 "is_configured": true, 00:19:16.534 "data_offset": 2048, 00:19:16.534 "data_size": 63488 00:19:16.534 } 00:19:16.534 ] 00:19:16.534 }' 00:19:16.534 19:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.534 19:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:17.101 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:17.360 [2024-07-24 19:55:08.737878] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.360 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.619 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.619 "name": "Existed_Raid", 00:19:17.619 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:17.619 "strip_size_kb": 64, 00:19:17.619 "state": "configuring", 00:19:17.619 "raid_level": "raid0", 00:19:17.619 "superblock": true, 00:19:17.619 "num_base_bdevs": 4, 00:19:17.619 "num_base_bdevs_discovered": 2, 00:19:17.619 "num_base_bdevs_operational": 4, 00:19:17.619 "base_bdevs_list": [ 00:19:17.619 { 00:19:17.619 "name": "BaseBdev1", 00:19:17.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.619 "is_configured": false, 00:19:17.619 "data_offset": 0, 00:19:17.619 "data_size": 0 00:19:17.619 }, 00:19:17.619 { 00:19:17.619 "name": null, 00:19:17.619 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:17.619 "is_configured": false, 00:19:17.619 "data_offset": 2048, 00:19:17.619 "data_size": 
63488 00:19:17.619 }, 00:19:17.619 { 00:19:17.619 "name": "BaseBdev3", 00:19:17.619 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:17.619 "is_configured": true, 00:19:17.619 "data_offset": 2048, 00:19:17.619 "data_size": 63488 00:19:17.619 }, 00:19:17.619 { 00:19:17.619 "name": "BaseBdev4", 00:19:17.619 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:17.619 "is_configured": true, 00:19:17.619 "data_offset": 2048, 00:19:17.619 "data_size": 63488 00:19:17.619 } 00:19:17.619 ] 00:19:17.619 }' 00:19:17.619 19:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.619 19:55:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:18.186 19:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.186 19:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:18.186 19:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:18.186 19:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:18.445 [2024-07-24 19:55:09.968526] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:18.445 BaseBdev1 00:19:18.445 19:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:18.445 19:55:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:18.445 19:55:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:18.445 19:55:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 
00:19:18.445 19:55:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:18.445 19:55:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:18.445 19:55:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.704 19:55:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:18.962 [ 00:19:18.962 { 00:19:18.962 "name": "BaseBdev1", 00:19:18.962 "aliases": [ 00:19:18.962 "f08672ef-a83e-44fa-b554-48c8e9a3d47b" 00:19:18.962 ], 00:19:18.962 "product_name": "Malloc disk", 00:19:18.962 "block_size": 512, 00:19:18.962 "num_blocks": 65536, 00:19:18.962 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:18.962 "assigned_rate_limits": { 00:19:18.962 "rw_ios_per_sec": 0, 00:19:18.963 "rw_mbytes_per_sec": 0, 00:19:18.963 "r_mbytes_per_sec": 0, 00:19:18.963 "w_mbytes_per_sec": 0 00:19:18.963 }, 00:19:18.963 "claimed": true, 00:19:18.963 "claim_type": "exclusive_write", 00:19:18.963 "zoned": false, 00:19:18.963 "supported_io_types": { 00:19:18.963 "read": true, 00:19:18.963 "write": true, 00:19:18.963 "unmap": true, 00:19:18.963 "flush": true, 00:19:18.963 "reset": true, 00:19:18.963 "nvme_admin": false, 00:19:18.963 "nvme_io": false, 00:19:18.963 "nvme_io_md": false, 00:19:18.963 "write_zeroes": true, 00:19:18.963 "zcopy": true, 00:19:18.963 "get_zone_info": false, 00:19:18.963 "zone_management": false, 00:19:18.963 "zone_append": false, 00:19:18.963 "compare": false, 00:19:18.963 "compare_and_write": false, 00:19:18.963 "abort": true, 00:19:18.963 "seek_hole": false, 00:19:18.963 "seek_data": false, 00:19:18.963 "copy": true, 00:19:18.963 "nvme_iov_md": false 00:19:18.963 }, 00:19:18.963 
"memory_domains": [ 00:19:18.963 { 00:19:18.963 "dma_device_id": "system", 00:19:18.963 "dma_device_type": 1 00:19:18.963 }, 00:19:18.963 { 00:19:18.963 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.963 "dma_device_type": 2 00:19:18.963 } 00:19:18.963 ], 00:19:18.963 "driver_specific": {} 00:19:18.963 } 00:19:18.963 ] 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.963 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.222 19:55:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.222 "name": "Existed_Raid", 00:19:19.222 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:19.222 "strip_size_kb": 64, 00:19:19.222 "state": "configuring", 00:19:19.222 "raid_level": "raid0", 00:19:19.222 "superblock": true, 00:19:19.222 "num_base_bdevs": 4, 00:19:19.222 "num_base_bdevs_discovered": 3, 00:19:19.222 "num_base_bdevs_operational": 4, 00:19:19.222 "base_bdevs_list": [ 00:19:19.222 { 00:19:19.222 "name": "BaseBdev1", 00:19:19.222 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:19.222 "is_configured": true, 00:19:19.222 "data_offset": 2048, 00:19:19.222 "data_size": 63488 00:19:19.222 }, 00:19:19.222 { 00:19:19.222 "name": null, 00:19:19.222 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:19.222 "is_configured": false, 00:19:19.222 "data_offset": 2048, 00:19:19.222 "data_size": 63488 00:19:19.222 }, 00:19:19.222 { 00:19:19.222 "name": "BaseBdev3", 00:19:19.222 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:19.222 "is_configured": true, 00:19:19.222 "data_offset": 2048, 00:19:19.222 "data_size": 63488 00:19:19.222 }, 00:19:19.222 { 00:19:19.222 "name": "BaseBdev4", 00:19:19.222 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:19.222 "is_configured": true, 00:19:19.222 "data_offset": 2048, 00:19:19.222 "data_size": 63488 00:19:19.222 } 00:19:19.222 ] 00:19:19.222 }' 00:19:19.222 19:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.222 19:55:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.789 19:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.789 19:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:20.048 19:55:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:20.048 19:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:20.616 [2024-07-24 19:55:12.034704] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.616 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.184 19:55:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.184 "name": "Existed_Raid", 00:19:21.184 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:21.184 "strip_size_kb": 64, 00:19:21.184 "state": "configuring", 00:19:21.184 "raid_level": "raid0", 00:19:21.184 "superblock": true, 00:19:21.184 "num_base_bdevs": 4, 00:19:21.184 "num_base_bdevs_discovered": 2, 00:19:21.184 "num_base_bdevs_operational": 4, 00:19:21.184 "base_bdevs_list": [ 00:19:21.184 { 00:19:21.184 "name": "BaseBdev1", 00:19:21.184 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:21.184 "is_configured": true, 00:19:21.184 "data_offset": 2048, 00:19:21.184 "data_size": 63488 00:19:21.184 }, 00:19:21.184 { 00:19:21.184 "name": null, 00:19:21.184 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:21.184 "is_configured": false, 00:19:21.184 "data_offset": 2048, 00:19:21.184 "data_size": 63488 00:19:21.184 }, 00:19:21.184 { 00:19:21.184 "name": null, 00:19:21.184 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:21.184 "is_configured": false, 00:19:21.184 "data_offset": 2048, 00:19:21.184 "data_size": 63488 00:19:21.184 }, 00:19:21.184 { 00:19:21.184 "name": "BaseBdev4", 00:19:21.184 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:21.184 "is_configured": true, 00:19:21.184 "data_offset": 2048, 00:19:21.184 "data_size": 63488 00:19:21.184 } 00:19:21.184 ] 00:19:21.184 }' 00:19:21.184 19:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.184 19:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:21.751 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.751 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:22.009 19:55:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:22.009 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:22.267 [2024-07-24 19:55:13.691633] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.267 19:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.526 19:55:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.526 "name": "Existed_Raid", 00:19:22.526 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:22.526 "strip_size_kb": 64, 00:19:22.526 "state": "configuring", 00:19:22.526 "raid_level": "raid0", 00:19:22.526 "superblock": true, 00:19:22.526 "num_base_bdevs": 4, 00:19:22.526 "num_base_bdevs_discovered": 3, 00:19:22.526 "num_base_bdevs_operational": 4, 00:19:22.526 "base_bdevs_list": [ 00:19:22.526 { 00:19:22.526 "name": "BaseBdev1", 00:19:22.526 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:22.526 "is_configured": true, 00:19:22.526 "data_offset": 2048, 00:19:22.526 "data_size": 63488 00:19:22.526 }, 00:19:22.526 { 00:19:22.526 "name": null, 00:19:22.526 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:22.526 "is_configured": false, 00:19:22.526 "data_offset": 2048, 00:19:22.526 "data_size": 63488 00:19:22.526 }, 00:19:22.526 { 00:19:22.526 "name": "BaseBdev3", 00:19:22.526 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:22.526 "is_configured": true, 00:19:22.526 "data_offset": 2048, 00:19:22.526 "data_size": 63488 00:19:22.526 }, 00:19:22.526 { 00:19:22.526 "name": "BaseBdev4", 00:19:22.526 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:22.526 "is_configured": true, 00:19:22.526 "data_offset": 2048, 00:19:22.526 "data_size": 63488 00:19:22.526 } 00:19:22.526 ] 00:19:22.526 }' 00:19:22.526 19:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.526 19:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.091 19:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.091 19:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:23.350 19:55:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:23.350 19:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:23.608 [2024-07-24 19:55:15.087724] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.608 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.609 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.609 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.609 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.609 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:23.867 19:55:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.868 "name": "Existed_Raid", 00:19:23.868 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:23.868 "strip_size_kb": 64, 00:19:23.868 "state": "configuring", 00:19:23.868 "raid_level": "raid0", 00:19:23.868 "superblock": true, 00:19:23.868 "num_base_bdevs": 4, 00:19:23.868 "num_base_bdevs_discovered": 2, 00:19:23.868 "num_base_bdevs_operational": 4, 00:19:23.868 "base_bdevs_list": [ 00:19:23.868 { 00:19:23.868 "name": null, 00:19:23.868 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:23.868 "is_configured": false, 00:19:23.868 "data_offset": 2048, 00:19:23.868 "data_size": 63488 00:19:23.868 }, 00:19:23.868 { 00:19:23.868 "name": null, 00:19:23.868 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:23.868 "is_configured": false, 00:19:23.868 "data_offset": 2048, 00:19:23.868 "data_size": 63488 00:19:23.868 }, 00:19:23.868 { 00:19:23.868 "name": "BaseBdev3", 00:19:23.868 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:23.868 "is_configured": true, 00:19:23.868 "data_offset": 2048, 00:19:23.868 "data_size": 63488 00:19:23.868 }, 00:19:23.868 { 00:19:23.868 "name": "BaseBdev4", 00:19:23.868 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:23.868 "is_configured": true, 00:19:23.868 "data_offset": 2048, 00:19:23.868 "data_size": 63488 00:19:23.868 } 00:19:23.868 ] 00:19:23.868 }' 00:19:23.868 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.868 19:55:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:24.436 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.436 19:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:24.695 19:55:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:24.695 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:24.954 [2024-07-24 19:55:16.438880] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.954 19:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.522 19:55:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.522 "name": "Existed_Raid", 00:19:25.522 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:25.522 "strip_size_kb": 64, 00:19:25.522 "state": "configuring", 00:19:25.522 "raid_level": "raid0", 00:19:25.522 "superblock": true, 00:19:25.522 "num_base_bdevs": 4, 00:19:25.522 "num_base_bdevs_discovered": 3, 00:19:25.522 "num_base_bdevs_operational": 4, 00:19:25.522 "base_bdevs_list": [ 00:19:25.522 { 00:19:25.522 "name": null, 00:19:25.522 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:25.522 "is_configured": false, 00:19:25.522 "data_offset": 2048, 00:19:25.522 "data_size": 63488 00:19:25.522 }, 00:19:25.522 { 00:19:25.522 "name": "BaseBdev2", 00:19:25.522 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:25.522 "is_configured": true, 00:19:25.522 "data_offset": 2048, 00:19:25.522 "data_size": 63488 00:19:25.522 }, 00:19:25.522 { 00:19:25.522 "name": "BaseBdev3", 00:19:25.522 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:25.522 "is_configured": true, 00:19:25.522 "data_offset": 2048, 00:19:25.522 "data_size": 63488 00:19:25.522 }, 00:19:25.522 { 00:19:25.522 "name": "BaseBdev4", 00:19:25.522 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:25.522 "is_configured": true, 00:19:25.522 "data_offset": 2048, 00:19:25.522 "data_size": 63488 00:19:25.522 } 00:19:25.522 ] 00:19:25.522 }' 00:19:25.522 19:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.522 19:55:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:26.089 19:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.089 19:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:26.351 19:55:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:26.351 19:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.351 19:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:26.609 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f08672ef-a83e-44fa-b554-48c8e9a3d47b 00:19:26.868 [2024-07-24 19:55:18.315432] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:26.868 [2024-07-24 19:55:18.315742] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22eeee0 00:19:26.868 [2024-07-24 19:55:18.315771] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:26.868 [2024-07-24 19:55:18.316149] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22eec30 00:19:26.868 [2024-07-24 19:55:18.316414] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22eeee0 00:19:26.868 [2024-07-24 19:55:18.316437] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22eeee0 00:19:26.868 NewBaseBdev 00:19:26.868 [2024-07-24 19:55:18.316633] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:26.868 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:26.868 19:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:19:26.868 19:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:26.868 19:55:18 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # local i 00:19:26.868 19:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:26.868 19:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:26.868 19:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:27.126 19:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:27.385 [ 00:19:27.385 { 00:19:27.385 "name": "NewBaseBdev", 00:19:27.385 "aliases": [ 00:19:27.385 "f08672ef-a83e-44fa-b554-48c8e9a3d47b" 00:19:27.385 ], 00:19:27.385 "product_name": "Malloc disk", 00:19:27.385 "block_size": 512, 00:19:27.385 "num_blocks": 65536, 00:19:27.385 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:27.385 "assigned_rate_limits": { 00:19:27.385 "rw_ios_per_sec": 0, 00:19:27.385 "rw_mbytes_per_sec": 0, 00:19:27.385 "r_mbytes_per_sec": 0, 00:19:27.385 "w_mbytes_per_sec": 0 00:19:27.385 }, 00:19:27.385 "claimed": true, 00:19:27.385 "claim_type": "exclusive_write", 00:19:27.385 "zoned": false, 00:19:27.385 "supported_io_types": { 00:19:27.385 "read": true, 00:19:27.385 "write": true, 00:19:27.385 "unmap": true, 00:19:27.385 "flush": true, 00:19:27.385 "reset": true, 00:19:27.385 "nvme_admin": false, 00:19:27.385 "nvme_io": false, 00:19:27.385 "nvme_io_md": false, 00:19:27.385 "write_zeroes": true, 00:19:27.385 "zcopy": true, 00:19:27.385 "get_zone_info": false, 00:19:27.385 "zone_management": false, 00:19:27.385 "zone_append": false, 00:19:27.385 "compare": false, 00:19:27.385 "compare_and_write": false, 00:19:27.385 "abort": true, 00:19:27.385 "seek_hole": false, 00:19:27.385 "seek_data": false, 00:19:27.385 "copy": true, 00:19:27.385 
"nvme_iov_md": false 00:19:27.385 }, 00:19:27.385 "memory_domains": [ 00:19:27.385 { 00:19:27.385 "dma_device_id": "system", 00:19:27.385 "dma_device_type": 1 00:19:27.385 }, 00:19:27.385 { 00:19:27.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.385 "dma_device_type": 2 00:19:27.385 } 00:19:27.385 ], 00:19:27.385 "driver_specific": {} 00:19:27.385 } 00:19:27.385 ] 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.385 19:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:19:27.645 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.645 "name": "Existed_Raid", 00:19:27.645 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:27.645 "strip_size_kb": 64, 00:19:27.645 "state": "online", 00:19:27.645 "raid_level": "raid0", 00:19:27.645 "superblock": true, 00:19:27.645 "num_base_bdevs": 4, 00:19:27.645 "num_base_bdevs_discovered": 4, 00:19:27.645 "num_base_bdevs_operational": 4, 00:19:27.645 "base_bdevs_list": [ 00:19:27.645 { 00:19:27.645 "name": "NewBaseBdev", 00:19:27.645 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:27.645 "is_configured": true, 00:19:27.645 "data_offset": 2048, 00:19:27.645 "data_size": 63488 00:19:27.645 }, 00:19:27.645 { 00:19:27.645 "name": "BaseBdev2", 00:19:27.645 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:27.645 "is_configured": true, 00:19:27.645 "data_offset": 2048, 00:19:27.645 "data_size": 63488 00:19:27.645 }, 00:19:27.645 { 00:19:27.645 "name": "BaseBdev3", 00:19:27.645 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:27.645 "is_configured": true, 00:19:27.645 "data_offset": 2048, 00:19:27.645 "data_size": 63488 00:19:27.645 }, 00:19:27.645 { 00:19:27.645 "name": "BaseBdev4", 00:19:27.645 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:27.645 "is_configured": true, 00:19:27.645 "data_offset": 2048, 00:19:27.645 "data_size": 63488 00:19:27.645 } 00:19:27.645 ] 00:19:27.645 }' 00:19:27.645 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.645 19:55:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:28.582 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:28.582 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:28.582 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_info 00:19:28.582 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:28.582 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:28.582 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:28.583 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:28.583 19:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:28.583 [2024-07-24 19:55:20.165705] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:28.841 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:28.841 "name": "Existed_Raid", 00:19:28.841 "aliases": [ 00:19:28.841 "167f6c9e-970c-4600-90fa-57a2f4d9f99f" 00:19:28.841 ], 00:19:28.841 "product_name": "Raid Volume", 00:19:28.841 "block_size": 512, 00:19:28.841 "num_blocks": 253952, 00:19:28.841 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:28.841 "assigned_rate_limits": { 00:19:28.841 "rw_ios_per_sec": 0, 00:19:28.841 "rw_mbytes_per_sec": 0, 00:19:28.841 "r_mbytes_per_sec": 0, 00:19:28.841 "w_mbytes_per_sec": 0 00:19:28.841 }, 00:19:28.841 "claimed": false, 00:19:28.841 "zoned": false, 00:19:28.841 "supported_io_types": { 00:19:28.841 "read": true, 00:19:28.841 "write": true, 00:19:28.841 "unmap": true, 00:19:28.841 "flush": true, 00:19:28.841 "reset": true, 00:19:28.841 "nvme_admin": false, 00:19:28.841 "nvme_io": false, 00:19:28.841 "nvme_io_md": false, 00:19:28.841 "write_zeroes": true, 00:19:28.841 "zcopy": false, 00:19:28.841 "get_zone_info": false, 00:19:28.841 "zone_management": false, 00:19:28.841 "zone_append": false, 00:19:28.841 "compare": false, 00:19:28.841 "compare_and_write": false, 00:19:28.841 "abort": false, 
00:19:28.841 "seek_hole": false, 00:19:28.841 "seek_data": false, 00:19:28.841 "copy": false, 00:19:28.841 "nvme_iov_md": false 00:19:28.841 }, 00:19:28.841 "memory_domains": [ 00:19:28.841 { 00:19:28.841 "dma_device_id": "system", 00:19:28.841 "dma_device_type": 1 00:19:28.841 }, 00:19:28.842 { 00:19:28.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.842 "dma_device_type": 2 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "dma_device_id": "system", 00:19:28.842 "dma_device_type": 1 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.842 "dma_device_type": 2 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "dma_device_id": "system", 00:19:28.842 "dma_device_type": 1 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.842 "dma_device_type": 2 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "dma_device_id": "system", 00:19:28.842 "dma_device_type": 1 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.842 "dma_device_type": 2 00:19:28.842 } 00:19:28.842 ], 00:19:28.842 "driver_specific": { 00:19:28.842 "raid": { 00:19:28.842 "uuid": "167f6c9e-970c-4600-90fa-57a2f4d9f99f", 00:19:28.842 "strip_size_kb": 64, 00:19:28.842 "state": "online", 00:19:28.842 "raid_level": "raid0", 00:19:28.842 "superblock": true, 00:19:28.842 "num_base_bdevs": 4, 00:19:28.842 "num_base_bdevs_discovered": 4, 00:19:28.842 "num_base_bdevs_operational": 4, 00:19:28.842 "base_bdevs_list": [ 00:19:28.842 { 00:19:28.842 "name": "NewBaseBdev", 00:19:28.842 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:28.842 "is_configured": true, 00:19:28.842 "data_offset": 2048, 00:19:28.842 "data_size": 63488 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "name": "BaseBdev2", 00:19:28.842 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:28.842 "is_configured": true, 00:19:28.842 "data_offset": 2048, 00:19:28.842 "data_size": 63488 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "name": 
"BaseBdev3", 00:19:28.842 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:28.842 "is_configured": true, 00:19:28.842 "data_offset": 2048, 00:19:28.842 "data_size": 63488 00:19:28.842 }, 00:19:28.842 { 00:19:28.842 "name": "BaseBdev4", 00:19:28.842 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:28.842 "is_configured": true, 00:19:28.842 "data_offset": 2048, 00:19:28.842 "data_size": 63488 00:19:28.842 } 00:19:28.842 ] 00:19:28.842 } 00:19:28.842 } 00:19:28.842 }' 00:19:28.842 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:28.842 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:28.842 BaseBdev2 00:19:28.842 BaseBdev3 00:19:28.842 BaseBdev4' 00:19:28.842 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:28.842 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:28.842 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.101 "name": "NewBaseBdev", 00:19:29.101 "aliases": [ 00:19:29.101 "f08672ef-a83e-44fa-b554-48c8e9a3d47b" 00:19:29.101 ], 00:19:29.101 "product_name": "Malloc disk", 00:19:29.101 "block_size": 512, 00:19:29.101 "num_blocks": 65536, 00:19:29.101 "uuid": "f08672ef-a83e-44fa-b554-48c8e9a3d47b", 00:19:29.101 "assigned_rate_limits": { 00:19:29.101 "rw_ios_per_sec": 0, 00:19:29.101 "rw_mbytes_per_sec": 0, 00:19:29.101 "r_mbytes_per_sec": 0, 00:19:29.101 "w_mbytes_per_sec": 0 00:19:29.101 }, 00:19:29.101 "claimed": true, 00:19:29.101 "claim_type": "exclusive_write", 00:19:29.101 "zoned": false, 00:19:29.101 
"supported_io_types": { 00:19:29.101 "read": true, 00:19:29.101 "write": true, 00:19:29.101 "unmap": true, 00:19:29.101 "flush": true, 00:19:29.101 "reset": true, 00:19:29.101 "nvme_admin": false, 00:19:29.101 "nvme_io": false, 00:19:29.101 "nvme_io_md": false, 00:19:29.101 "write_zeroes": true, 00:19:29.101 "zcopy": true, 00:19:29.101 "get_zone_info": false, 00:19:29.101 "zone_management": false, 00:19:29.101 "zone_append": false, 00:19:29.101 "compare": false, 00:19:29.101 "compare_and_write": false, 00:19:29.101 "abort": true, 00:19:29.101 "seek_hole": false, 00:19:29.101 "seek_data": false, 00:19:29.101 "copy": true, 00:19:29.101 "nvme_iov_md": false 00:19:29.101 }, 00:19:29.101 "memory_domains": [ 00:19:29.101 { 00:19:29.101 "dma_device_id": "system", 00:19:29.101 "dma_device_type": 1 00:19:29.101 }, 00:19:29.101 { 00:19:29.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.101 "dma_device_type": 2 00:19:29.101 } 00:19:29.101 ], 00:19:29.101 "driver_specific": {} 00:19:29.101 }' 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.101 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.360 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.360 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.360 19:55:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.360 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.360 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.360 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.360 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:29.360 19:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.688 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.688 "name": "BaseBdev2", 00:19:29.688 "aliases": [ 00:19:29.688 "cf4cb410-e9a8-4a84-a021-7e522fb00e9a" 00:19:29.688 ], 00:19:29.688 "product_name": "Malloc disk", 00:19:29.688 "block_size": 512, 00:19:29.688 "num_blocks": 65536, 00:19:29.688 "uuid": "cf4cb410-e9a8-4a84-a021-7e522fb00e9a", 00:19:29.688 "assigned_rate_limits": { 00:19:29.688 "rw_ios_per_sec": 0, 00:19:29.688 "rw_mbytes_per_sec": 0, 00:19:29.689 "r_mbytes_per_sec": 0, 00:19:29.689 "w_mbytes_per_sec": 0 00:19:29.689 }, 00:19:29.689 "claimed": true, 00:19:29.689 "claim_type": "exclusive_write", 00:19:29.689 "zoned": false, 00:19:29.689 "supported_io_types": { 00:19:29.689 "read": true, 00:19:29.689 "write": true, 00:19:29.689 "unmap": true, 00:19:29.689 "flush": true, 00:19:29.689 "reset": true, 00:19:29.689 "nvme_admin": false, 00:19:29.689 "nvme_io": false, 00:19:29.689 "nvme_io_md": false, 00:19:29.689 "write_zeroes": true, 00:19:29.689 "zcopy": true, 00:19:29.689 "get_zone_info": false, 00:19:29.689 "zone_management": false, 00:19:29.689 "zone_append": false, 00:19:29.689 "compare": false, 00:19:29.689 "compare_and_write": false, 00:19:29.689 "abort": true, 00:19:29.689 
"seek_hole": false, 00:19:29.689 "seek_data": false, 00:19:29.689 "copy": true, 00:19:29.689 "nvme_iov_md": false 00:19:29.689 }, 00:19:29.689 "memory_domains": [ 00:19:29.689 { 00:19:29.689 "dma_device_id": "system", 00:19:29.689 "dma_device_type": 1 00:19:29.689 }, 00:19:29.689 { 00:19:29.689 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.689 "dma_device_type": 2 00:19:29.689 } 00:19:29.689 ], 00:19:29.689 "driver_specific": {} 00:19:29.689 }' 00:19:29.689 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.689 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.689 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.689 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.689 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:29.947 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.206 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.206 "name": "BaseBdev3", 00:19:30.206 "aliases": [ 00:19:30.206 "934290c4-18e8-438e-9f3f-125a220c70d0" 00:19:30.206 ], 00:19:30.206 "product_name": "Malloc disk", 00:19:30.206 "block_size": 512, 00:19:30.206 "num_blocks": 65536, 00:19:30.206 "uuid": "934290c4-18e8-438e-9f3f-125a220c70d0", 00:19:30.206 "assigned_rate_limits": { 00:19:30.206 "rw_ios_per_sec": 0, 00:19:30.206 "rw_mbytes_per_sec": 0, 00:19:30.206 "r_mbytes_per_sec": 0, 00:19:30.206 "w_mbytes_per_sec": 0 00:19:30.206 }, 00:19:30.206 "claimed": true, 00:19:30.206 "claim_type": "exclusive_write", 00:19:30.206 "zoned": false, 00:19:30.206 "supported_io_types": { 00:19:30.206 "read": true, 00:19:30.206 "write": true, 00:19:30.206 "unmap": true, 00:19:30.206 "flush": true, 00:19:30.206 "reset": true, 00:19:30.206 "nvme_admin": false, 00:19:30.206 "nvme_io": false, 00:19:30.206 "nvme_io_md": false, 00:19:30.206 "write_zeroes": true, 00:19:30.206 "zcopy": true, 00:19:30.206 "get_zone_info": false, 00:19:30.206 "zone_management": false, 00:19:30.206 "zone_append": false, 00:19:30.206 "compare": false, 00:19:30.206 "compare_and_write": false, 00:19:30.206 "abort": true, 00:19:30.206 "seek_hole": false, 00:19:30.206 "seek_data": false, 00:19:30.206 "copy": true, 00:19:30.206 "nvme_iov_md": false 00:19:30.206 }, 00:19:30.206 "memory_domains": [ 00:19:30.206 { 00:19:30.206 "dma_device_id": "system", 00:19:30.206 "dma_device_type": 1 00:19:30.206 }, 00:19:30.206 { 00:19:30.206 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.206 "dma_device_type": 2 00:19:30.206 } 00:19:30.206 ], 00:19:30.206 "driver_specific": {} 00:19:30.206 }' 00:19:30.206 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.206 
19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.530 19:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.530 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.530 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.530 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.530 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:30.530 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.790 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.790 "name": "BaseBdev4", 00:19:30.790 "aliases": [ 00:19:30.790 "dd898240-692e-4d59-98d3-943178a64621" 00:19:30.790 ], 00:19:30.790 "product_name": "Malloc disk", 00:19:30.790 "block_size": 512, 00:19:30.790 "num_blocks": 65536, 00:19:30.790 "uuid": "dd898240-692e-4d59-98d3-943178a64621", 00:19:30.790 
"assigned_rate_limits": { 00:19:30.790 "rw_ios_per_sec": 0, 00:19:30.790 "rw_mbytes_per_sec": 0, 00:19:30.790 "r_mbytes_per_sec": 0, 00:19:30.790 "w_mbytes_per_sec": 0 00:19:30.790 }, 00:19:30.790 "claimed": true, 00:19:30.790 "claim_type": "exclusive_write", 00:19:30.790 "zoned": false, 00:19:30.790 "supported_io_types": { 00:19:30.790 "read": true, 00:19:30.790 "write": true, 00:19:30.790 "unmap": true, 00:19:30.790 "flush": true, 00:19:30.790 "reset": true, 00:19:30.790 "nvme_admin": false, 00:19:30.790 "nvme_io": false, 00:19:30.790 "nvme_io_md": false, 00:19:30.790 "write_zeroes": true, 00:19:30.790 "zcopy": true, 00:19:30.790 "get_zone_info": false, 00:19:30.790 "zone_management": false, 00:19:30.790 "zone_append": false, 00:19:30.790 "compare": false, 00:19:30.790 "compare_and_write": false, 00:19:30.790 "abort": true, 00:19:30.790 "seek_hole": false, 00:19:30.790 "seek_data": false, 00:19:30.790 "copy": true, 00:19:30.790 "nvme_iov_md": false 00:19:30.790 }, 00:19:30.790 "memory_domains": [ 00:19:30.790 { 00:19:30.790 "dma_device_id": "system", 00:19:30.790 "dma_device_type": 1 00:19:30.790 }, 00:19:30.790 { 00:19:30.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.790 "dma_device_type": 2 00:19:30.790 } 00:19:30.790 ], 00:19:30.790 "driver_specific": {} 00:19:30.790 }' 00:19:30.790 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.790 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.049 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.309 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.309 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:31.309 [2024-07-24 19:55:22.897029] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:31.309 [2024-07-24 19:55:22.897080] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:31.309 [2024-07-24 19:55:22.897176] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:31.309 [2024-07-24 19:55:22.897296] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:31.309 [2024-07-24 19:55:22.897322] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22eeee0 name Existed_Raid, state offline 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1441707 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1441707 ']' 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 1441707 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 
-- # '[' Linux = Linux ']' 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1441707 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1441707' 00:19:31.568 killing process with pid 1441707 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1441707 00:19:31.568 [2024-07-24 19:55:22.962045] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:31.568 19:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1441707 00:19:31.568 [2024-07-24 19:55:23.007444] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:31.830 19:55:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:31.830 00:19:31.830 real 0m33.749s 00:19:31.830 user 1m1.772s 00:19:31.830 sys 0m5.983s 00:19:31.830 19:55:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:31.830 19:55:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:31.830 ************************************ 00:19:31.830 END TEST raid_state_function_test_sb 00:19:31.830 ************************************ 00:19:31.830 19:55:23 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:19:31.830 19:55:23 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:19:31.830 19:55:23 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:31.830 19:55:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:32.094 ************************************ 
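The `killprocess` helper seen above first probes the target with `kill -0 $pid` (signal 0 checks existence without delivering anything), verifies the process name with `ps --no-headers -o comm=`, and only then sends the real signal. A minimal Python sketch of the same liveness probe, assuming POSIX signal semantics:

```python
import errno
import os

def process_alive(pid):
    # Equivalent of the shell's `kill -0 $pid`: signal 0 performs the
    # permission/existence check without actually signalling the process.
    try:
        os.kill(pid, 0)
    except OSError as e:
        if e.errno == errno.ESRCH:  # no such process
            return False
        # EPERM: the process exists but we may not signal it
    return True

print(process_alive(os.getpid()))  # True: our own pid is always alive
```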
00:19:32.094 START TEST raid_superblock_test 00:19:32.094 ************************************ 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 4 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # 
raid_pid=1446663 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1446663 /var/tmp/spdk-raid.sock 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1446663 ']' 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:32.094 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:32.094 19:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:32.094 [2024-07-24 19:55:23.499476] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
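`waitforlisten` above blocks until the freshly launched `bdev_svc` app accepts RPCs on `/var/tmp/spdk-raid.sock`. A hypothetical re-implementation of that idea (the real helper lives in `autotest_common.sh`; this is only a sketch of the polling pattern) could look like:

```python
import socket
import time

def wait_for_listen(sock_path, timeout=5.0, interval=0.1):
    # Poll until something accepts connections on the UNIX domain socket,
    # or give up after `timeout` seconds.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        try:
            s.connect(sock_path)
            return True
        except OSError:
            time.sleep(interval)
        finally:
            s.close()
    return False
```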
00:19:32.094 [2024-07-24 19:55:23.499549] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1446663 ] 00:19:32.094 [2024-07-24 19:55:23.630507] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:32.354 [2024-07-24 19:55:23.736810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:32.354 [2024-07-24 19:55:23.800755] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:32.354 [2024-07-24 19:55:23.800785] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:32.922 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:33.181 malloc1 00:19:33.181 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:33.439 [2024-07-24 19:55:24.906116] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:33.440 [2024-07-24 19:55:24.906164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:33.440 [2024-07-24 19:55:24.906185] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f0e590 00:19:33.440 [2024-07-24 19:55:24.906197] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:33.440 [2024-07-24 19:55:24.907889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:33.440 [2024-07-24 19:55:24.907918] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:33.440 pt1 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:33.440 19:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:33.440 19:55:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:33.699 malloc2 00:19:33.699 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:33.958 [2024-07-24 19:55:25.400118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:33.958 [2024-07-24 19:55:25.400166] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:33.958 [2024-07-24 19:55:25.400185] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b4690 00:19:33.958 [2024-07-24 19:55:25.400198] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:33.958 [2024-07-24 19:55:25.401713] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:33.958 [2024-07-24 19:55:25.401741] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:33.958 pt2 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:33.958 19:55:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:33.958 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:34.217 malloc3 00:19:34.217 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:34.476 [2024-07-24 19:55:25.907299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:34.476 [2024-07-24 19:55:25.907346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:34.476 [2024-07-24 19:55:25.907365] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b5fc0 00:19:34.476 [2024-07-24 19:55:25.907378] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:34.476 [2024-07-24 19:55:25.908946] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:34.476 [2024-07-24 19:55:25.908973] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:34.476 pt3 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:34.476 
19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:34.476 19:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:34.735 malloc4 00:19:34.735 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:34.994 [2024-07-24 19:55:26.406466] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:34.994 [2024-07-24 19:55:26.406515] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:34.994 [2024-07-24 19:55:26.406535] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b71c0 00:19:34.994 [2024-07-24 19:55:26.406547] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:34.994 [2024-07-24 19:55:26.408111] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:34.994 [2024-07-24 19:55:26.408138] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:34.994 pt4 00:19:34.994 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:19:34.994 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:19:34.994 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:35.252 [2024-07-24 19:55:26.651137] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:19:35.252 [2024-07-24 19:55:26.652454] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:35.252 [2024-07-24 19:55:26.652508] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:35.252 [2024-07-24 19:55:26.652553] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:35.252 [2024-07-24 19:55:26.652735] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20bfe80 00:19:35.252 [2024-07-24 19:55:26.652746] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:35.252 [2024-07-24 19:55:26.652949] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f25480 00:19:35.252 [2024-07-24 19:55:26.653096] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20bfe80 00:19:35.252 [2024-07-24 19:55:26.653107] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20bfe80 00:19:35.252 [2024-07-24 19:55:26.653208] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.252 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:35.253 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.253 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.253 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.253 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:35.510 19:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.511 "name": "raid_bdev1", 00:19:35.511 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:35.511 "strip_size_kb": 64, 00:19:35.511 "state": "online", 00:19:35.511 "raid_level": "raid0", 00:19:35.511 "superblock": true, 00:19:35.511 "num_base_bdevs": 4, 00:19:35.511 "num_base_bdevs_discovered": 4, 00:19:35.511 "num_base_bdevs_operational": 4, 00:19:35.511 "base_bdevs_list": [ 00:19:35.511 { 00:19:35.511 "name": "pt1", 00:19:35.511 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:35.511 "is_configured": true, 00:19:35.511 "data_offset": 2048, 00:19:35.511 "data_size": 63488 00:19:35.511 }, 00:19:35.511 { 00:19:35.511 "name": "pt2", 00:19:35.511 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:35.511 "is_configured": true, 00:19:35.511 "data_offset": 2048, 00:19:35.511 "data_size": 63488 00:19:35.511 }, 00:19:35.511 { 00:19:35.511 "name": "pt3", 00:19:35.511 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:35.511 "is_configured": true, 00:19:35.511 "data_offset": 2048, 00:19:35.511 "data_size": 63488 00:19:35.511 }, 00:19:35.511 { 00:19:35.511 "name": "pt4", 00:19:35.511 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:35.511 "is_configured": true, 00:19:35.511 "data_offset": 2048, 00:19:35.511 "data_size": 63488 00:19:35.511 } 00:19:35.511 ] 00:19:35.511 }' 00:19:35.511 19:55:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.511 19:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:36.079 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:36.338 [2024-07-24 19:55:27.758336] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:36.338 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:36.338 "name": "raid_bdev1", 00:19:36.338 "aliases": [ 00:19:36.338 "bc411021-18a7-4e94-a582-876864e96814" 00:19:36.338 ], 00:19:36.338 "product_name": "Raid Volume", 00:19:36.338 "block_size": 512, 00:19:36.338 "num_blocks": 253952, 00:19:36.338 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:36.338 "assigned_rate_limits": { 00:19:36.338 "rw_ios_per_sec": 0, 00:19:36.338 "rw_mbytes_per_sec": 0, 00:19:36.338 "r_mbytes_per_sec": 0, 00:19:36.338 "w_mbytes_per_sec": 0 00:19:36.338 }, 00:19:36.338 "claimed": false, 00:19:36.338 "zoned": false, 00:19:36.338 "supported_io_types": { 00:19:36.338 "read": true, 00:19:36.338 "write": true, 00:19:36.338 
"unmap": true, 00:19:36.338 "flush": true, 00:19:36.338 "reset": true, 00:19:36.338 "nvme_admin": false, 00:19:36.338 "nvme_io": false, 00:19:36.338 "nvme_io_md": false, 00:19:36.338 "write_zeroes": true, 00:19:36.338 "zcopy": false, 00:19:36.338 "get_zone_info": false, 00:19:36.338 "zone_management": false, 00:19:36.338 "zone_append": false, 00:19:36.338 "compare": false, 00:19:36.338 "compare_and_write": false, 00:19:36.338 "abort": false, 00:19:36.338 "seek_hole": false, 00:19:36.338 "seek_data": false, 00:19:36.338 "copy": false, 00:19:36.338 "nvme_iov_md": false 00:19:36.338 }, 00:19:36.338 "memory_domains": [ 00:19:36.338 { 00:19:36.338 "dma_device_id": "system", 00:19:36.338 "dma_device_type": 1 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.338 "dma_device_type": 2 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "dma_device_id": "system", 00:19:36.338 "dma_device_type": 1 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.338 "dma_device_type": 2 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "dma_device_id": "system", 00:19:36.338 "dma_device_type": 1 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.338 "dma_device_type": 2 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "dma_device_id": "system", 00:19:36.338 "dma_device_type": 1 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.338 "dma_device_type": 2 00:19:36.338 } 00:19:36.338 ], 00:19:36.338 "driver_specific": { 00:19:36.338 "raid": { 00:19:36.338 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:36.338 "strip_size_kb": 64, 00:19:36.338 "state": "online", 00:19:36.338 "raid_level": "raid0", 00:19:36.338 "superblock": true, 00:19:36.338 "num_base_bdevs": 4, 00:19:36.338 "num_base_bdevs_discovered": 4, 00:19:36.338 "num_base_bdevs_operational": 4, 00:19:36.338 "base_bdevs_list": [ 00:19:36.338 { 00:19:36.338 "name": "pt1", 
00:19:36.338 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:36.338 "is_configured": true, 00:19:36.338 "data_offset": 2048, 00:19:36.338 "data_size": 63488 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "name": "pt2", 00:19:36.338 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:36.338 "is_configured": true, 00:19:36.338 "data_offset": 2048, 00:19:36.338 "data_size": 63488 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "name": "pt3", 00:19:36.338 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:36.338 "is_configured": true, 00:19:36.338 "data_offset": 2048, 00:19:36.338 "data_size": 63488 00:19:36.338 }, 00:19:36.338 { 00:19:36.338 "name": "pt4", 00:19:36.338 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:36.338 "is_configured": true, 00:19:36.338 "data_offset": 2048, 00:19:36.338 "data_size": 63488 00:19:36.338 } 00:19:36.338 ] 00:19:36.338 } 00:19:36.338 } 00:19:36.338 }' 00:19:36.338 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:36.338 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:36.338 pt2 00:19:36.338 pt3 00:19:36.338 pt4' 00:19:36.338 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:36.338 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:36.338 19:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:36.597 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:36.597 "name": "pt1", 00:19:36.597 "aliases": [ 00:19:36.597 "00000000-0000-0000-0000-000000000001" 00:19:36.597 ], 00:19:36.597 "product_name": "passthru", 00:19:36.597 "block_size": 512, 00:19:36.597 "num_blocks": 65536, 00:19:36.597 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:19:36.597 "assigned_rate_limits": { 00:19:36.597 "rw_ios_per_sec": 0, 00:19:36.597 "rw_mbytes_per_sec": 0, 00:19:36.597 "r_mbytes_per_sec": 0, 00:19:36.597 "w_mbytes_per_sec": 0 00:19:36.597 }, 00:19:36.597 "claimed": true, 00:19:36.597 "claim_type": "exclusive_write", 00:19:36.597 "zoned": false, 00:19:36.597 "supported_io_types": { 00:19:36.597 "read": true, 00:19:36.597 "write": true, 00:19:36.597 "unmap": true, 00:19:36.597 "flush": true, 00:19:36.597 "reset": true, 00:19:36.597 "nvme_admin": false, 00:19:36.597 "nvme_io": false, 00:19:36.597 "nvme_io_md": false, 00:19:36.597 "write_zeroes": true, 00:19:36.597 "zcopy": true, 00:19:36.597 "get_zone_info": false, 00:19:36.597 "zone_management": false, 00:19:36.597 "zone_append": false, 00:19:36.597 "compare": false, 00:19:36.597 "compare_and_write": false, 00:19:36.597 "abort": true, 00:19:36.597 "seek_hole": false, 00:19:36.597 "seek_data": false, 00:19:36.597 "copy": true, 00:19:36.597 "nvme_iov_md": false 00:19:36.597 }, 00:19:36.597 "memory_domains": [ 00:19:36.597 { 00:19:36.597 "dma_device_id": "system", 00:19:36.597 "dma_device_type": 1 00:19:36.597 }, 00:19:36.597 { 00:19:36.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.597 "dma_device_type": 2 00:19:36.597 } 00:19:36.597 ], 00:19:36.597 "driver_specific": { 00:19:36.597 "passthru": { 00:19:36.597 "name": "pt1", 00:19:36.597 "base_bdev_name": "malloc1" 00:19:36.597 } 00:19:36.597 } 00:19:36.597 }' 00:19:36.597 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.597 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:36.856 19:55:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:36.856 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.115 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.115 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.115 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:37.115 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.374 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.374 "name": "pt2", 00:19:37.374 "aliases": [ 00:19:37.374 "00000000-0000-0000-0000-000000000002" 00:19:37.374 ], 00:19:37.374 "product_name": "passthru", 00:19:37.374 "block_size": 512, 00:19:37.374 "num_blocks": 65536, 00:19:37.374 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:37.374 "assigned_rate_limits": { 00:19:37.374 "rw_ios_per_sec": 0, 00:19:37.374 "rw_mbytes_per_sec": 0, 00:19:37.374 "r_mbytes_per_sec": 0, 00:19:37.374 "w_mbytes_per_sec": 0 00:19:37.374 }, 00:19:37.374 "claimed": true, 00:19:37.374 "claim_type": "exclusive_write", 00:19:37.374 "zoned": false, 00:19:37.374 "supported_io_types": { 00:19:37.374 "read": true, 00:19:37.374 "write": true, 00:19:37.374 "unmap": true, 00:19:37.374 "flush": true, 00:19:37.374 "reset": true, 00:19:37.374 "nvme_admin": false, 00:19:37.374 
"nvme_io": false, 00:19:37.374 "nvme_io_md": false, 00:19:37.374 "write_zeroes": true, 00:19:37.374 "zcopy": true, 00:19:37.374 "get_zone_info": false, 00:19:37.374 "zone_management": false, 00:19:37.374 "zone_append": false, 00:19:37.374 "compare": false, 00:19:37.374 "compare_and_write": false, 00:19:37.374 "abort": true, 00:19:37.374 "seek_hole": false, 00:19:37.374 "seek_data": false, 00:19:37.374 "copy": true, 00:19:37.374 "nvme_iov_md": false 00:19:37.374 }, 00:19:37.375 "memory_domains": [ 00:19:37.375 { 00:19:37.375 "dma_device_id": "system", 00:19:37.375 "dma_device_type": 1 00:19:37.375 }, 00:19:37.375 { 00:19:37.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.375 "dma_device_type": 2 00:19:37.375 } 00:19:37.375 ], 00:19:37.375 "driver_specific": { 00:19:37.375 "passthru": { 00:19:37.375 "name": "pt2", 00:19:37.375 "base_bdev_name": "malloc2" 00:19:37.375 } 00:19:37.375 } 00:19:37.375 }' 00:19:37.375 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.375 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.375 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:37.375 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.375 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.375 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:37.375 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.634 19:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.634 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:37.634 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.634 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:37.634 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.634 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.634 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:37.634 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.893 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.893 "name": "pt3", 00:19:37.893 "aliases": [ 00:19:37.893 "00000000-0000-0000-0000-000000000003" 00:19:37.893 ], 00:19:37.893 "product_name": "passthru", 00:19:37.893 "block_size": 512, 00:19:37.893 "num_blocks": 65536, 00:19:37.893 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:37.893 "assigned_rate_limits": { 00:19:37.893 "rw_ios_per_sec": 0, 00:19:37.893 "rw_mbytes_per_sec": 0, 00:19:37.893 "r_mbytes_per_sec": 0, 00:19:37.893 "w_mbytes_per_sec": 0 00:19:37.893 }, 00:19:37.893 "claimed": true, 00:19:37.893 "claim_type": "exclusive_write", 00:19:37.893 "zoned": false, 00:19:37.893 "supported_io_types": { 00:19:37.893 "read": true, 00:19:37.893 "write": true, 00:19:37.893 "unmap": true, 00:19:37.893 "flush": true, 00:19:37.893 "reset": true, 00:19:37.893 "nvme_admin": false, 00:19:37.893 "nvme_io": false, 00:19:37.893 "nvme_io_md": false, 00:19:37.893 "write_zeroes": true, 00:19:37.893 "zcopy": true, 00:19:37.893 "get_zone_info": false, 00:19:37.893 "zone_management": false, 00:19:37.893 "zone_append": false, 00:19:37.893 "compare": false, 00:19:37.893 "compare_and_write": false, 00:19:37.893 "abort": true, 00:19:37.893 "seek_hole": false, 00:19:37.893 "seek_data": false, 00:19:37.893 "copy": true, 00:19:37.893 "nvme_iov_md": false 00:19:37.893 }, 00:19:37.893 "memory_domains": [ 00:19:37.893 { 00:19:37.893 "dma_device_id": "system", 00:19:37.893 
"dma_device_type": 1 00:19:37.893 }, 00:19:37.893 { 00:19:37.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.893 "dma_device_type": 2 00:19:37.893 } 00:19:37.893 ], 00:19:37.893 "driver_specific": { 00:19:37.893 "passthru": { 00:19:37.893 "name": "pt3", 00:19:37.893 "base_bdev_name": "malloc3" 00:19:37.893 } 00:19:37.893 } 00:19:37.893 }' 00:19:37.893 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.893 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.152 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.412 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.412 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:38.412 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:38.412 19:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:38.672 "name": "pt4", 00:19:38.672 "aliases": [ 00:19:38.672 "00000000-0000-0000-0000-000000000004" 00:19:38.672 ], 00:19:38.672 "product_name": "passthru", 00:19:38.672 "block_size": 512, 00:19:38.672 "num_blocks": 65536, 00:19:38.672 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:38.672 "assigned_rate_limits": { 00:19:38.672 "rw_ios_per_sec": 0, 00:19:38.672 "rw_mbytes_per_sec": 0, 00:19:38.672 "r_mbytes_per_sec": 0, 00:19:38.672 "w_mbytes_per_sec": 0 00:19:38.672 }, 00:19:38.672 "claimed": true, 00:19:38.672 "claim_type": "exclusive_write", 00:19:38.672 "zoned": false, 00:19:38.672 "supported_io_types": { 00:19:38.672 "read": true, 00:19:38.672 "write": true, 00:19:38.672 "unmap": true, 00:19:38.672 "flush": true, 00:19:38.672 "reset": true, 00:19:38.672 "nvme_admin": false, 00:19:38.672 "nvme_io": false, 00:19:38.672 "nvme_io_md": false, 00:19:38.672 "write_zeroes": true, 00:19:38.672 "zcopy": true, 00:19:38.672 "get_zone_info": false, 00:19:38.672 "zone_management": false, 00:19:38.672 "zone_append": false, 00:19:38.672 "compare": false, 00:19:38.672 "compare_and_write": false, 00:19:38.672 "abort": true, 00:19:38.672 "seek_hole": false, 00:19:38.672 "seek_data": false, 00:19:38.672 "copy": true, 00:19:38.672 "nvme_iov_md": false 00:19:38.672 }, 00:19:38.672 "memory_domains": [ 00:19:38.672 { 00:19:38.672 "dma_device_id": "system", 00:19:38.672 "dma_device_type": 1 00:19:38.672 }, 00:19:38.672 { 00:19:38.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.672 "dma_device_type": 2 00:19:38.672 } 00:19:38.672 ], 00:19:38.672 "driver_specific": { 00:19:38.672 "passthru": { 00:19:38.672 "name": "pt4", 00:19:38.672 "base_bdev_name": "malloc4" 00:19:38.672 } 00:19:38.672 } 00:19:38.672 }' 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.672 19:55:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.672 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.931 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.931 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.931 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.931 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.931 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:38.931 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:19:39.190 [2024-07-24 19:55:30.629955] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:39.190 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=bc411021-18a7-4e94-a582-876864e96814 00:19:39.190 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z bc411021-18a7-4e94-a582-876864e96814 ']' 00:19:39.190 19:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:39.758 [2024-07-24 19:55:31.130956] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:39.758 
[2024-07-24 19:55:31.130975] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:39.758 [2024-07-24 19:55:31.131023] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:39.758 [2024-07-24 19:55:31.131086] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:39.758 [2024-07-24 19:55:31.131098] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20bfe80 name raid_bdev1, state offline 00:19:39.758 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.758 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:19:40.326 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:19:40.326 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:19:40.326 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:40.326 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:40.585 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:40.585 19:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:41.154 19:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:41.154 19:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:41.154 19:55:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:19:41.154 19:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:41.722 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:41.722 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:41.981 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:42.240 [2024-07-24 19:55:33.677578] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:42.240 [2024-07-24 19:55:33.678970] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:42.240 [2024-07-24 19:55:33.679014] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:42.240 [2024-07-24 19:55:33.679048] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:42.240 [2024-07-24 19:55:33.679093] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:42.240 [2024-07-24 19:55:33.679136] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:42.240 [2024-07-24 19:55:33.679160] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:42.240 [2024-07-24 19:55:33.679182] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:42.240 
[2024-07-24 19:55:33.679201] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:42.240 [2024-07-24 19:55:33.679211] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20b5c40 name raid_bdev1, state configuring 00:19:42.240 request: 00:19:42.240 { 00:19:42.240 "name": "raid_bdev1", 00:19:42.240 "raid_level": "raid0", 00:19:42.240 "base_bdevs": [ 00:19:42.240 "malloc1", 00:19:42.240 "malloc2", 00:19:42.240 "malloc3", 00:19:42.240 "malloc4" 00:19:42.240 ], 00:19:42.240 "strip_size_kb": 64, 00:19:42.240 "superblock": false, 00:19:42.240 "method": "bdev_raid_create", 00:19:42.240 "req_id": 1 00:19:42.240 } 00:19:42.240 Got JSON-RPC error response 00:19:42.240 response: 00:19:42.240 { 00:19:42.240 "code": -17, 00:19:42.240 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:42.240 } 00:19:42.240 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:19:42.240 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:19:42.240 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:19:42.240 19:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:19:42.240 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.240 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:19:42.499 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:19:42.499 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:19:42.499 19:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:19:42.758 [2024-07-24 19:55:34.190856] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:42.758 [2024-07-24 19:55:34.190900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:42.758 [2024-07-24 19:55:34.190922] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b4460 00:19:42.758 [2024-07-24 19:55:34.190935] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:42.758 [2024-07-24 19:55:34.192531] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:42.758 [2024-07-24 19:55:34.192562] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:42.758 [2024-07-24 19:55:34.192630] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:42.758 [2024-07-24 19:55:34.192657] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:42.758 pt1 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.758 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.759 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:43.327 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.327 "name": "raid_bdev1", 00:19:43.327 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:43.327 "strip_size_kb": 64, 00:19:43.327 "state": "configuring", 00:19:43.327 "raid_level": "raid0", 00:19:43.327 "superblock": true, 00:19:43.327 "num_base_bdevs": 4, 00:19:43.327 "num_base_bdevs_discovered": 1, 00:19:43.327 "num_base_bdevs_operational": 4, 00:19:43.327 "base_bdevs_list": [ 00:19:43.327 { 00:19:43.327 "name": "pt1", 00:19:43.327 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:43.327 "is_configured": true, 00:19:43.327 "data_offset": 2048, 00:19:43.327 "data_size": 63488 00:19:43.327 }, 00:19:43.327 { 00:19:43.327 "name": null, 00:19:43.327 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:43.327 "is_configured": false, 00:19:43.327 "data_offset": 2048, 00:19:43.327 "data_size": 63488 00:19:43.327 }, 00:19:43.327 { 00:19:43.327 "name": null, 00:19:43.327 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:43.327 "is_configured": false, 00:19:43.327 "data_offset": 2048, 00:19:43.327 "data_size": 63488 00:19:43.327 }, 00:19:43.327 { 00:19:43.327 "name": null, 00:19:43.327 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:43.327 "is_configured": false, 00:19:43.327 "data_offset": 2048, 00:19:43.327 "data_size": 63488 00:19:43.327 } 00:19:43.327 ] 00:19:43.327 }' 00:19:43.327 19:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.327 19:55:34 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.895 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:19:43.895 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:43.895 [2024-07-24 19:55:35.466235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:43.895 [2024-07-24 19:55:35.466284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:43.895 [2024-07-24 19:55:35.466305] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f07da0 00:19:43.895 [2024-07-24 19:55:35.466317] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:43.895 [2024-07-24 19:55:35.466660] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:43.895 [2024-07-24 19:55:35.466678] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:43.895 [2024-07-24 19:55:35.466739] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:43.895 [2024-07-24 19:55:35.466758] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:43.895 pt2 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:44.154 [2024-07-24 19:55:35.714917] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:44.154 19:55:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.154 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:44.413 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.413 "name": "raid_bdev1", 00:19:44.413 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:44.413 "strip_size_kb": 64, 00:19:44.413 "state": "configuring", 00:19:44.413 "raid_level": "raid0", 00:19:44.413 "superblock": true, 00:19:44.413 "num_base_bdevs": 4, 00:19:44.413 "num_base_bdevs_discovered": 1, 00:19:44.413 "num_base_bdevs_operational": 4, 00:19:44.413 "base_bdevs_list": [ 00:19:44.413 { 00:19:44.413 "name": "pt1", 00:19:44.413 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:44.413 "is_configured": true, 00:19:44.413 "data_offset": 2048, 00:19:44.413 "data_size": 63488 00:19:44.413 }, 00:19:44.413 { 00:19:44.413 "name": null, 00:19:44.414 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:44.414 
"is_configured": false, 00:19:44.414 "data_offset": 2048, 00:19:44.414 "data_size": 63488 00:19:44.414 }, 00:19:44.414 { 00:19:44.414 "name": null, 00:19:44.414 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:44.414 "is_configured": false, 00:19:44.414 "data_offset": 2048, 00:19:44.414 "data_size": 63488 00:19:44.414 }, 00:19:44.414 { 00:19:44.414 "name": null, 00:19:44.414 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:44.414 "is_configured": false, 00:19:44.414 "data_offset": 2048, 00:19:44.414 "data_size": 63488 00:19:44.414 } 00:19:44.414 ] 00:19:44.414 }' 00:19:44.414 19:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.414 19:55:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:44.982 19:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:19:44.982 19:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:19:44.982 19:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:45.240 [2024-07-24 19:55:36.737614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:45.240 [2024-07-24 19:55:36.737659] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:45.240 [2024-07-24 19:55:36.737678] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f05c90 00:19:45.240 [2024-07-24 19:55:36.737690] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:45.240 [2024-07-24 19:55:36.738025] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:45.240 [2024-07-24 19:55:36.738041] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:45.240 [2024-07-24 19:55:36.738103] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:45.240 [2024-07-24 19:55:36.738122] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:45.240 pt2 00:19:45.240 19:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:19:45.240 19:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:19:45.240 19:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:45.499 [2024-07-24 19:55:36.986269] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:45.499 [2024-07-24 19:55:36.986298] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:45.499 [2024-07-24 19:55:36.986316] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b48c0 00:19:45.499 [2024-07-24 19:55:36.986328] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:45.499 [2024-07-24 19:55:36.986613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:45.499 [2024-07-24 19:55:36.986630] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:45.499 [2024-07-24 19:55:36.986681] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:45.499 [2024-07-24 19:55:36.986698] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:45.499 pt3 00:19:45.499 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:19:45.499 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:19:45.499 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:45.757 [2024-07-24 19:55:37.178786] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:45.757 [2024-07-24 19:55:37.178813] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:45.757 [2024-07-24 19:55:37.178828] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b2ff0 00:19:45.757 [2024-07-24 19:55:37.178840] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:45.757 [2024-07-24 19:55:37.179099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:45.757 [2024-07-24 19:55:37.179122] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:45.757 [2024-07-24 19:55:37.179167] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:45.757 [2024-07-24 19:55:37.179184] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:45.757 [2024-07-24 19:55:37.179298] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f05530 00:19:45.757 [2024-07-24 19:55:37.179308] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:45.757 [2024-07-24 19:55:37.179483] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f0d960 00:19:45.757 [2024-07-24 19:55:37.179609] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f05530 00:19:45.757 [2024-07-24 19:55:37.179619] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f05530 00:19:45.757 [2024-07-24 19:55:37.179710] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:45.757 pt4 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 
00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:45.757 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:45.758 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.758 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.016 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.016 "name": "raid_bdev1", 00:19:46.016 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:46.016 "strip_size_kb": 64, 00:19:46.016 "state": "online", 00:19:46.016 "raid_level": "raid0", 00:19:46.016 "superblock": true, 00:19:46.016 "num_base_bdevs": 4, 00:19:46.016 "num_base_bdevs_discovered": 4, 00:19:46.016 "num_base_bdevs_operational": 4, 
00:19:46.016 "base_bdevs_list": [ 00:19:46.016 { 00:19:46.016 "name": "pt1", 00:19:46.016 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:46.016 "is_configured": true, 00:19:46.016 "data_offset": 2048, 00:19:46.016 "data_size": 63488 00:19:46.016 }, 00:19:46.016 { 00:19:46.016 "name": "pt2", 00:19:46.016 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:46.016 "is_configured": true, 00:19:46.016 "data_offset": 2048, 00:19:46.016 "data_size": 63488 00:19:46.016 }, 00:19:46.016 { 00:19:46.016 "name": "pt3", 00:19:46.016 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:46.016 "is_configured": true, 00:19:46.016 "data_offset": 2048, 00:19:46.016 "data_size": 63488 00:19:46.016 }, 00:19:46.016 { 00:19:46.016 "name": "pt4", 00:19:46.016 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:46.016 "is_configured": true, 00:19:46.016 "data_offset": 2048, 00:19:46.016 "data_size": 63488 00:19:46.016 } 00:19:46.016 ] 00:19:46.016 }' 00:19:46.016 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.016 19:55:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:19:46.585 19:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:46.844 [2024-07-24 19:55:38.221875] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:46.844 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:46.844 "name": "raid_bdev1", 00:19:46.844 "aliases": [ 00:19:46.844 "bc411021-18a7-4e94-a582-876864e96814" 00:19:46.844 ], 00:19:46.844 "product_name": "Raid Volume", 00:19:46.844 "block_size": 512, 00:19:46.844 "num_blocks": 253952, 00:19:46.844 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:46.844 "assigned_rate_limits": { 00:19:46.844 "rw_ios_per_sec": 0, 00:19:46.844 "rw_mbytes_per_sec": 0, 00:19:46.844 "r_mbytes_per_sec": 0, 00:19:46.844 "w_mbytes_per_sec": 0 00:19:46.844 }, 00:19:46.844 "claimed": false, 00:19:46.844 "zoned": false, 00:19:46.844 "supported_io_types": { 00:19:46.844 "read": true, 00:19:46.844 "write": true, 00:19:46.844 "unmap": true, 00:19:46.844 "flush": true, 00:19:46.844 "reset": true, 00:19:46.844 "nvme_admin": false, 00:19:46.844 "nvme_io": false, 00:19:46.844 "nvme_io_md": false, 00:19:46.844 "write_zeroes": true, 00:19:46.845 "zcopy": false, 00:19:46.845 "get_zone_info": false, 00:19:46.845 "zone_management": false, 00:19:46.845 "zone_append": false, 00:19:46.845 "compare": false, 00:19:46.845 "compare_and_write": false, 00:19:46.845 "abort": false, 00:19:46.845 "seek_hole": false, 00:19:46.845 "seek_data": false, 00:19:46.845 "copy": false, 00:19:46.845 "nvme_iov_md": false 00:19:46.845 }, 00:19:46.845 "memory_domains": [ 00:19:46.845 { 00:19:46.845 "dma_device_id": "system", 00:19:46.845 "dma_device_type": 1 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.845 "dma_device_type": 2 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "dma_device_id": "system", 00:19:46.845 "dma_device_type": 1 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:19:46.845 "dma_device_type": 2 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "dma_device_id": "system", 00:19:46.845 "dma_device_type": 1 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.845 "dma_device_type": 2 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "dma_device_id": "system", 00:19:46.845 "dma_device_type": 1 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.845 "dma_device_type": 2 00:19:46.845 } 00:19:46.845 ], 00:19:46.845 "driver_specific": { 00:19:46.845 "raid": { 00:19:46.845 "uuid": "bc411021-18a7-4e94-a582-876864e96814", 00:19:46.845 "strip_size_kb": 64, 00:19:46.845 "state": "online", 00:19:46.845 "raid_level": "raid0", 00:19:46.845 "superblock": true, 00:19:46.845 "num_base_bdevs": 4, 00:19:46.845 "num_base_bdevs_discovered": 4, 00:19:46.845 "num_base_bdevs_operational": 4, 00:19:46.845 "base_bdevs_list": [ 00:19:46.845 { 00:19:46.845 "name": "pt1", 00:19:46.845 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:46.845 "is_configured": true, 00:19:46.845 "data_offset": 2048, 00:19:46.845 "data_size": 63488 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "name": "pt2", 00:19:46.845 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:46.845 "is_configured": true, 00:19:46.845 "data_offset": 2048, 00:19:46.845 "data_size": 63488 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "name": "pt3", 00:19:46.845 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:46.845 "is_configured": true, 00:19:46.845 "data_offset": 2048, 00:19:46.845 "data_size": 63488 00:19:46.845 }, 00:19:46.845 { 00:19:46.845 "name": "pt4", 00:19:46.845 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:46.845 "is_configured": true, 00:19:46.845 "data_offset": 2048, 00:19:46.845 "data_size": 63488 00:19:46.845 } 00:19:46.845 ] 00:19:46.845 } 00:19:46.845 } 00:19:46.845 }' 00:19:46.845 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:46.845 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:46.845 pt2 00:19:46.845 pt3 00:19:46.845 pt4' 00:19:46.845 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:46.845 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:46.845 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:47.219 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.219 "name": "pt1", 00:19:47.219 "aliases": [ 00:19:47.219 "00000000-0000-0000-0000-000000000001" 00:19:47.219 ], 00:19:47.219 "product_name": "passthru", 00:19:47.219 "block_size": 512, 00:19:47.219 "num_blocks": 65536, 00:19:47.219 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:47.219 "assigned_rate_limits": { 00:19:47.219 "rw_ios_per_sec": 0, 00:19:47.219 "rw_mbytes_per_sec": 0, 00:19:47.219 "r_mbytes_per_sec": 0, 00:19:47.219 "w_mbytes_per_sec": 0 00:19:47.219 }, 00:19:47.220 "claimed": true, 00:19:47.220 "claim_type": "exclusive_write", 00:19:47.220 "zoned": false, 00:19:47.220 "supported_io_types": { 00:19:47.220 "read": true, 00:19:47.220 "write": true, 00:19:47.220 "unmap": true, 00:19:47.220 "flush": true, 00:19:47.220 "reset": true, 00:19:47.220 "nvme_admin": false, 00:19:47.220 "nvme_io": false, 00:19:47.220 "nvme_io_md": false, 00:19:47.220 "write_zeroes": true, 00:19:47.220 "zcopy": true, 00:19:47.220 "get_zone_info": false, 00:19:47.220 "zone_management": false, 00:19:47.220 "zone_append": false, 00:19:47.220 "compare": false, 00:19:47.220 "compare_and_write": false, 00:19:47.220 "abort": true, 00:19:47.220 "seek_hole": false, 00:19:47.220 "seek_data": false, 00:19:47.220 "copy": true, 00:19:47.220 "nvme_iov_md": 
false 00:19:47.220 }, 00:19:47.220 "memory_domains": [ 00:19:47.220 { 00:19:47.220 "dma_device_id": "system", 00:19:47.220 "dma_device_type": 1 00:19:47.220 }, 00:19:47.220 { 00:19:47.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.220 "dma_device_type": 2 00:19:47.220 } 00:19:47.220 ], 00:19:47.220 "driver_specific": { 00:19:47.220 "passthru": { 00:19:47.220 "name": "pt1", 00:19:47.220 "base_bdev_name": "malloc1" 00:19:47.220 } 00:19:47.220 } 00:19:47.220 }' 00:19:47.220 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.220 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.220 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:47.220 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.220 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.220 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:47.220 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.479 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.479 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.479 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.479 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.479 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:47.479 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:47.479 19:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:47.479 19:55:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:47.736 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:47.736 "name": "pt2", 00:19:47.736 "aliases": [ 00:19:47.736 "00000000-0000-0000-0000-000000000002" 00:19:47.736 ], 00:19:47.736 "product_name": "passthru", 00:19:47.736 "block_size": 512, 00:19:47.736 "num_blocks": 65536, 00:19:47.736 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:47.736 "assigned_rate_limits": { 00:19:47.736 "rw_ios_per_sec": 0, 00:19:47.736 "rw_mbytes_per_sec": 0, 00:19:47.736 "r_mbytes_per_sec": 0, 00:19:47.736 "w_mbytes_per_sec": 0 00:19:47.736 }, 00:19:47.736 "claimed": true, 00:19:47.736 "claim_type": "exclusive_write", 00:19:47.736 "zoned": false, 00:19:47.736 "supported_io_types": { 00:19:47.736 "read": true, 00:19:47.736 "write": true, 00:19:47.736 "unmap": true, 00:19:47.736 "flush": true, 00:19:47.736 "reset": true, 00:19:47.736 "nvme_admin": false, 00:19:47.736 "nvme_io": false, 00:19:47.736 "nvme_io_md": false, 00:19:47.736 "write_zeroes": true, 00:19:47.736 "zcopy": true, 00:19:47.736 "get_zone_info": false, 00:19:47.736 "zone_management": false, 00:19:47.736 "zone_append": false, 00:19:47.736 "compare": false, 00:19:47.736 "compare_and_write": false, 00:19:47.736 "abort": true, 00:19:47.736 "seek_hole": false, 00:19:47.736 "seek_data": false, 00:19:47.736 "copy": true, 00:19:47.736 "nvme_iov_md": false 00:19:47.736 }, 00:19:47.736 "memory_domains": [ 00:19:47.736 { 00:19:47.736 "dma_device_id": "system", 00:19:47.736 "dma_device_type": 1 00:19:47.736 }, 00:19:47.736 { 00:19:47.736 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.737 "dma_device_type": 2 00:19:47.737 } 00:19:47.737 ], 00:19:47.737 "driver_specific": { 00:19:47.737 "passthru": { 00:19:47.737 "name": "pt2", 00:19:47.737 "base_bdev_name": "malloc2" 00:19:47.737 } 00:19:47.737 } 00:19:47.737 }' 00:19:47.737 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:19:47.737 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:47.737 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:47.737 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.737 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:47.737 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:47.737 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:47.994 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:48.252 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:48.252 "name": "pt3", 00:19:48.252 "aliases": [ 00:19:48.252 "00000000-0000-0000-0000-000000000003" 00:19:48.252 ], 00:19:48.252 "product_name": "passthru", 00:19:48.252 "block_size": 512, 00:19:48.252 "num_blocks": 65536, 00:19:48.252 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:48.252 "assigned_rate_limits": { 00:19:48.252 "rw_ios_per_sec": 0, 00:19:48.252 "rw_mbytes_per_sec": 0, 
00:19:48.252 "r_mbytes_per_sec": 0, 00:19:48.252 "w_mbytes_per_sec": 0 00:19:48.252 }, 00:19:48.252 "claimed": true, 00:19:48.252 "claim_type": "exclusive_write", 00:19:48.252 "zoned": false, 00:19:48.252 "supported_io_types": { 00:19:48.252 "read": true, 00:19:48.252 "write": true, 00:19:48.252 "unmap": true, 00:19:48.252 "flush": true, 00:19:48.252 "reset": true, 00:19:48.252 "nvme_admin": false, 00:19:48.252 "nvme_io": false, 00:19:48.252 "nvme_io_md": false, 00:19:48.252 "write_zeroes": true, 00:19:48.252 "zcopy": true, 00:19:48.252 "get_zone_info": false, 00:19:48.252 "zone_management": false, 00:19:48.252 "zone_append": false, 00:19:48.252 "compare": false, 00:19:48.252 "compare_and_write": false, 00:19:48.252 "abort": true, 00:19:48.252 "seek_hole": false, 00:19:48.252 "seek_data": false, 00:19:48.252 "copy": true, 00:19:48.252 "nvme_iov_md": false 00:19:48.252 }, 00:19:48.252 "memory_domains": [ 00:19:48.252 { 00:19:48.252 "dma_device_id": "system", 00:19:48.252 "dma_device_type": 1 00:19:48.252 }, 00:19:48.252 { 00:19:48.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.252 "dma_device_type": 2 00:19:48.252 } 00:19:48.252 ], 00:19:48.252 "driver_specific": { 00:19:48.252 "passthru": { 00:19:48.252 "name": "pt3", 00:19:48.252 "base_bdev_name": "malloc3" 00:19:48.252 } 00:19:48.252 } 00:19:48.252 }' 00:19:48.252 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.252 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.252 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:48.252 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.510 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:48.510 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:48.510 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:19:48.510 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:48.510 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:48.510 19:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.510 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:48.510 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:48.510 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:48.510 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:48.510 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:48.767 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:48.767 "name": "pt4", 00:19:48.767 "aliases": [ 00:19:48.767 "00000000-0000-0000-0000-000000000004" 00:19:48.767 ], 00:19:48.767 "product_name": "passthru", 00:19:48.767 "block_size": 512, 00:19:48.767 "num_blocks": 65536, 00:19:48.767 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:48.767 "assigned_rate_limits": { 00:19:48.767 "rw_ios_per_sec": 0, 00:19:48.767 "rw_mbytes_per_sec": 0, 00:19:48.767 "r_mbytes_per_sec": 0, 00:19:48.767 "w_mbytes_per_sec": 0 00:19:48.767 }, 00:19:48.767 "claimed": true, 00:19:48.767 "claim_type": "exclusive_write", 00:19:48.767 "zoned": false, 00:19:48.767 "supported_io_types": { 00:19:48.767 "read": true, 00:19:48.767 "write": true, 00:19:48.767 "unmap": true, 00:19:48.767 "flush": true, 00:19:48.767 "reset": true, 00:19:48.767 "nvme_admin": false, 00:19:48.767 "nvme_io": false, 00:19:48.767 "nvme_io_md": false, 00:19:48.767 "write_zeroes": true, 00:19:48.767 "zcopy": true, 00:19:48.767 "get_zone_info": false, 00:19:48.767 
"zone_management": false, 00:19:48.767 "zone_append": false, 00:19:48.767 "compare": false, 00:19:48.767 "compare_and_write": false, 00:19:48.767 "abort": true, 00:19:48.767 "seek_hole": false, 00:19:48.767 "seek_data": false, 00:19:48.767 "copy": true, 00:19:48.767 "nvme_iov_md": false 00:19:48.767 }, 00:19:48.767 "memory_domains": [ 00:19:48.767 { 00:19:48.767 "dma_device_id": "system", 00:19:48.767 "dma_device_type": 1 00:19:48.767 }, 00:19:48.767 { 00:19:48.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.767 "dma_device_type": 2 00:19:48.767 } 00:19:48.767 ], 00:19:48.767 "driver_specific": { 00:19:48.767 "passthru": { 00:19:48.767 "name": "pt4", 00:19:48.767 "base_bdev_name": "malloc4" 00:19:48.767 } 00:19:48.767 } 00:19:48.767 }' 00:19:48.767 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.767 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:48.767 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:48.767 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.025 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:19:49.284 [2024-07-24 19:55:40.844853] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' bc411021-18a7-4e94-a582-876864e96814 '!=' bc411021-18a7-4e94-a582-876864e96814 ']' 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1446663 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1446663 ']' 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1446663 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:19:49.284 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:49.542 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1446663 00:19:49.542 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:49.542 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:49.542 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1446663' 00:19:49.542 killing process with pid 1446663 00:19:49.542 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1446663 
00:19:49.542 [2024-07-24 19:55:40.915954] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:49.542 [2024-07-24 19:55:40.916031] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:49.542 [2024-07-24 19:55:40.916095] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:49.542 [2024-07-24 19:55:40.916108] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f05530 name raid_bdev1, state offline 00:19:49.542 19:55:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1446663 00:19:49.542 [2024-07-24 19:55:40.958325] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:49.802 19:55:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:19:49.802 00:19:49.802 real 0m17.748s 00:19:49.802 user 0m32.127s 00:19:49.802 sys 0m3.191s 00:19:49.802 19:55:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:49.802 19:55:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.802 ************************************ 00:19:49.802 END TEST raid_superblock_test 00:19:49.802 ************************************ 00:19:49.802 19:55:41 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:19:49.802 19:55:41 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:19:49.802 19:55:41 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:49.802 19:55:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:49.802 ************************************ 00:19:49.802 START TEST raid_read_error_test 00:19:49.802 ************************************ 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 read 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local 
raid_level=raid0 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 
00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.SyXFX11miM 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1449306 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1449306 /var/tmp/spdk-raid.sock 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1449306 ']' 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:19:49.802 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:49.802 19:55:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.802 [2024-07-24 19:55:41.357447] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:19:49.802 [2024-07-24 19:55:41.357514] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1449306 ] 00:19:50.061 [2024-07-24 19:55:41.484984] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:50.061 [2024-07-24 19:55:41.586908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:50.061 [2024-07-24 19:55:41.647460] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:50.061 [2024-07-24 19:55:41.647509] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:50.997 19:55:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:50.997 19:55:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:19:50.997 19:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:50.997 19:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:50.997 BaseBdev1_malloc 00:19:50.997 19:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:51.255 true 00:19:51.255 19:55:42 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:51.513 [2024-07-24 19:55:43.020620] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:51.513 [2024-07-24 19:55:43.020665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:51.513 [2024-07-24 19:55:43.020686] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12923a0 00:19:51.513 [2024-07-24 19:55:43.020699] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:51.513 [2024-07-24 19:55:43.022430] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:51.513 [2024-07-24 19:55:43.022459] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:51.513 BaseBdev1 00:19:51.513 19:55:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:51.513 19:55:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:51.771 BaseBdev2_malloc 00:19:51.771 19:55:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:52.029 true 00:19:52.029 19:55:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:52.287 [2024-07-24 19:55:43.755143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:52.287 [2024-07-24 19:55:43.755188] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:52.287 [2024-07-24 19:55:43.755213] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1351370 00:19:52.287 [2024-07-24 19:55:43.755225] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:52.287 [2024-07-24 19:55:43.756834] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:52.287 [2024-07-24 19:55:43.756863] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:52.287 BaseBdev2 00:19:52.287 19:55:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:52.287 19:55:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:52.545 BaseBdev3_malloc 00:19:52.545 19:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:52.804 true 00:19:52.804 19:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:53.062 [2024-07-24 19:55:44.478888] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:53.062 [2024-07-24 19:55:44.478933] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:53.062 [2024-07-24 19:55:44.478957] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12872d0 00:19:53.062 [2024-07-24 19:55:44.478970] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:53.062 [2024-07-24 19:55:44.480555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:19:53.062 [2024-07-24 19:55:44.480583] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:53.062 BaseBdev3 00:19:53.062 19:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:53.062 19:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:53.320 BaseBdev4_malloc 00:19:53.320 19:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:53.579 true 00:19:53.579 19:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:53.837 [2024-07-24 19:55:45.197356] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:53.837 [2024-07-24 19:55:45.197409] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:53.837 [2024-07-24 19:55:45.197436] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x128a310 00:19:53.837 [2024-07-24 19:55:45.197449] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:53.838 [2024-07-24 19:55:45.199051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:53.838 [2024-07-24 19:55:45.199081] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:53.838 BaseBdev4 00:19:53.838 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:19:54.096 [2024-07-24 19:55:45.442045] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:54.096 [2024-07-24 19:55:45.443425] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:54.096 [2024-07-24 19:55:45.443496] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:54.096 [2024-07-24 19:55:45.443556] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:54.096 [2024-07-24 19:55:45.443793] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x128b060 00:19:54.096 [2024-07-24 19:55:45.443805] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:54.096 [2024-07-24 19:55:45.444004] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x128bc10 00:19:54.096 [2024-07-24 19:55:45.444163] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x128b060 00:19:54.096 [2024-07-24 19:55:45.444173] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x128b060 00:19:54.096 [2024-07-24 19:55:45.444279] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.096 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.355 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.355 "name": "raid_bdev1", 00:19:54.355 "uuid": "0da25986-1189-43f7-8f4f-7049315c8078", 00:19:54.355 "strip_size_kb": 64, 00:19:54.355 "state": "online", 00:19:54.355 "raid_level": "raid0", 00:19:54.355 "superblock": true, 00:19:54.355 "num_base_bdevs": 4, 00:19:54.355 "num_base_bdevs_discovered": 4, 00:19:54.355 "num_base_bdevs_operational": 4, 00:19:54.355 "base_bdevs_list": [ 00:19:54.355 { 00:19:54.355 "name": "BaseBdev1", 00:19:54.355 "uuid": "1d1b5d86-90aa-5261-bbb4-f2f87de451fa", 00:19:54.355 "is_configured": true, 00:19:54.355 "data_offset": 2048, 00:19:54.355 "data_size": 63488 00:19:54.355 }, 00:19:54.355 { 00:19:54.355 "name": "BaseBdev2", 00:19:54.355 "uuid": "0c6990af-44be-574c-ae6d-2452e6ddaaa0", 00:19:54.355 "is_configured": true, 00:19:54.355 "data_offset": 2048, 00:19:54.355 "data_size": 63488 00:19:54.355 }, 00:19:54.355 { 00:19:54.355 "name": "BaseBdev3", 00:19:54.355 "uuid": "8f41a790-5235-5302-a6cc-ba73a99787d5", 00:19:54.355 "is_configured": true, 00:19:54.355 "data_offset": 2048, 00:19:54.355 "data_size": 63488 00:19:54.355 }, 00:19:54.355 { 00:19:54.355 "name": "BaseBdev4", 00:19:54.355 "uuid": "e257addf-8f73-5b94-af61-45fff1292fe0", 00:19:54.355 
"is_configured": true, 00:19:54.355 "data_offset": 2048, 00:19:54.355 "data_size": 63488 00:19:54.355 } 00:19:54.355 ] 00:19:54.355 }' 00:19:54.355 19:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.355 19:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.923 19:55:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:19:54.923 19:55:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:54.923 [2024-07-24 19:55:46.396869] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1352e20 00:19:55.859 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:56.117 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.118 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.377 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.377 "name": "raid_bdev1", 00:19:56.377 "uuid": "0da25986-1189-43f7-8f4f-7049315c8078", 00:19:56.377 "strip_size_kb": 64, 00:19:56.377 "state": "online", 00:19:56.377 "raid_level": "raid0", 00:19:56.377 "superblock": true, 00:19:56.377 "num_base_bdevs": 4, 00:19:56.377 "num_base_bdevs_discovered": 4, 00:19:56.377 "num_base_bdevs_operational": 4, 00:19:56.377 "base_bdevs_list": [ 00:19:56.377 { 00:19:56.377 "name": "BaseBdev1", 00:19:56.377 "uuid": "1d1b5d86-90aa-5261-bbb4-f2f87de451fa", 00:19:56.377 "is_configured": true, 00:19:56.377 "data_offset": 2048, 00:19:56.377 "data_size": 63488 00:19:56.377 }, 00:19:56.377 { 00:19:56.377 "name": "BaseBdev2", 00:19:56.377 "uuid": "0c6990af-44be-574c-ae6d-2452e6ddaaa0", 00:19:56.377 "is_configured": true, 00:19:56.377 "data_offset": 2048, 00:19:56.377 "data_size": 63488 00:19:56.377 }, 00:19:56.377 { 00:19:56.377 "name": "BaseBdev3", 00:19:56.377 "uuid": "8f41a790-5235-5302-a6cc-ba73a99787d5", 00:19:56.377 "is_configured": true, 00:19:56.377 "data_offset": 2048, 00:19:56.377 "data_size": 63488 00:19:56.377 }, 00:19:56.377 { 00:19:56.377 "name": "BaseBdev4", 00:19:56.377 "uuid": 
"e257addf-8f73-5b94-af61-45fff1292fe0", 00:19:56.377 "is_configured": true, 00:19:56.377 "data_offset": 2048, 00:19:56.377 "data_size": 63488 00:19:56.377 } 00:19:56.377 ] 00:19:56.377 }' 00:19:56.377 19:55:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.377 19:55:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.944 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:57.204 [2024-07-24 19:55:48.631069] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:57.204 [2024-07-24 19:55:48.631106] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:57.204 [2024-07-24 19:55:48.634275] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:57.204 [2024-07-24 19:55:48.634315] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:57.204 [2024-07-24 19:55:48.634354] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:57.204 [2024-07-24 19:55:48.634365] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x128b060 name raid_bdev1, state offline 00:19:57.204 0 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1449306 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1449306 ']' 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1449306 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # 
ps --no-headers -o comm= 1449306 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1449306' 00:19:57.204 killing process with pid 1449306 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1449306 00:19:57.204 [2024-07-24 19:55:48.701510] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:57.204 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1449306 00:19:57.204 [2024-07-24 19:55:48.736692] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:57.463 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.SyXFX11miM 00:19:57.463 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.45 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.45 != \0\.\0\0 ]] 00:19:57.464 00:19:57.464 real 0m7.705s 00:19:57.464 user 0m12.358s 00:19:57.464 sys 0m1.332s 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:57.464 19:55:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:57.464 
************************************ 00:19:57.464 END TEST raid_read_error_test 00:19:57.464 ************************************ 00:19:57.464 19:55:49 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:57.464 19:55:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:19:57.464 19:55:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:57.464 19:55:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:57.723 ************************************ 00:19:57.723 START TEST raid_write_error_test 00:19:57.723 ************************************ 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 write 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # 
echo BaseBdev3 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.H7ueYiXG9A 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1450351 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@825 -- # waitforlisten 1450351 /var/tmp/spdk-raid.sock 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1450351 ']' 00:19:57.723 19:55:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:57.724 19:55:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:57.724 19:55:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:57.724 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:57.724 19:55:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:57.724 19:55:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:57.724 [2024-07-24 19:55:49.151632] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:19:57.724 [2024-07-24 19:55:49.151703] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1450351 ] 00:19:57.724 [2024-07-24 19:55:49.281906] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:57.983 [2024-07-24 19:55:49.388894] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:57.983 [2024-07-24 19:55:49.458045] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:57.983 [2024-07-24 19:55:49.458087] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:58.551 19:55:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:58.551 19:55:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:19:58.551 19:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:58.551 19:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:58.809 BaseBdev1_malloc 00:19:58.809 19:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:59.068 true 00:19:59.068 19:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:59.327 [2024-07-24 19:55:50.764582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:59.327 [2024-07-24 19:55:50.764625] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:19:59.327 [2024-07-24 19:55:50.764646] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b53a0 00:19:59.327 [2024-07-24 19:55:50.764658] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:59.327 [2024-07-24 19:55:50.766278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:59.327 [2024-07-24 19:55:50.766305] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:59.327 BaseBdev1 00:19:59.327 19:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:59.327 19:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:59.586 BaseBdev2_malloc 00:19:59.586 19:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:59.845 true 00:19:59.845 19:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:00.104 [2024-07-24 19:55:51.511182] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:00.104 [2024-07-24 19:55:51.511225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.104 [2024-07-24 19:55:51.511248] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1374370 00:20:00.104 [2024-07-24 19:55:51.511260] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.104 [2024-07-24 19:55:51.512796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.104 [2024-07-24 19:55:51.512826] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:00.104 BaseBdev2 00:20:00.104 19:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:00.104 19:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:00.363 BaseBdev3_malloc 00:20:00.363 19:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:00.621 true 00:20:00.621 19:55:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:00.880 [2024-07-24 19:55:52.245820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:00.880 [2024-07-24 19:55:52.245867] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.880 [2024-07-24 19:55:52.245889] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12aa2d0 00:20:00.880 [2024-07-24 19:55:52.245902] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.880 [2024-07-24 19:55:52.247414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.880 [2024-07-24 19:55:52.247442] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:00.880 BaseBdev3 00:20:00.880 19:55:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:00.880 19:55:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:01.139 BaseBdev4_malloc 00:20:01.139 19:55:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:01.398 true 00:20:01.398 19:55:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:01.657 [2024-07-24 19:55:52.992424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:01.657 [2024-07-24 19:55:52.992467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.657 [2024-07-24 19:55:52.992488] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ad310 00:20:01.657 [2024-07-24 19:55:52.992501] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.657 [2024-07-24 19:55:52.993917] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.657 [2024-07-24 19:55:52.993944] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:01.657 BaseBdev4 00:20:01.657 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:01.657 [2024-07-24 19:55:53.237104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:01.657 [2024-07-24 19:55:53.238362] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:01.657 [2024-07-24 19:55:53.238438] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:01.657 [2024-07-24 19:55:53.238497] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:01.657 [2024-07-24 19:55:53.238728] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ae060 00:20:01.657 [2024-07-24 19:55:53.238739] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:01.657 [2024-07-24 19:55:53.238932] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12aec10 00:20:01.657 [2024-07-24 19:55:53.239082] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ae060 00:20:01.657 [2024-07-24 19:55:53.239092] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12ae060 00:20:01.657 [2024-07-24 19:55:53.239192] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.915 19:55:53 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.915 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.173 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.173 "name": "raid_bdev1", 00:20:02.173 "uuid": "dd46327d-bf22-47ab-a74e-3157674666aa", 00:20:02.173 "strip_size_kb": 64, 00:20:02.173 "state": "online", 00:20:02.173 "raid_level": "raid0", 00:20:02.173 "superblock": true, 00:20:02.173 "num_base_bdevs": 4, 00:20:02.173 "num_base_bdevs_discovered": 4, 00:20:02.173 "num_base_bdevs_operational": 4, 00:20:02.173 "base_bdevs_list": [ 00:20:02.173 { 00:20:02.173 "name": "BaseBdev1", 00:20:02.173 "uuid": "ee0ecb56-2e8c-53f1-8d45-04d4e9e15968", 00:20:02.173 "is_configured": true, 00:20:02.173 "data_offset": 2048, 00:20:02.173 "data_size": 63488 00:20:02.173 }, 00:20:02.173 { 00:20:02.173 "name": "BaseBdev2", 00:20:02.173 "uuid": "a32754be-5263-52d6-9ab6-8133ad3da408", 00:20:02.173 "is_configured": true, 00:20:02.173 "data_offset": 2048, 00:20:02.173 "data_size": 63488 00:20:02.173 }, 00:20:02.173 { 00:20:02.173 "name": "BaseBdev3", 00:20:02.173 "uuid": "cef0ef49-12db-53dd-b7ce-5a93c076b723", 00:20:02.173 "is_configured": true, 00:20:02.173 "data_offset": 2048, 00:20:02.173 "data_size": 63488 00:20:02.173 }, 00:20:02.173 { 00:20:02.173 "name": "BaseBdev4", 00:20:02.173 "uuid": "41aed550-dbc0-5070-9cb1-17c7a1a982c5", 00:20:02.173 "is_configured": true, 00:20:02.173 "data_offset": 2048, 00:20:02.173 "data_size": 63488 00:20:02.173 } 00:20:02.173 ] 00:20:02.173 }' 00:20:02.173 19:55:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.173 19:55:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.740 19:55:54 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@840 -- # sleep 1 00:20:02.740 19:55:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:02.740 [2024-07-24 19:55:54.244051] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1375e20 00:20:03.675 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.970 19:55:55 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.970 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.230 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.230 "name": "raid_bdev1", 00:20:04.230 "uuid": "dd46327d-bf22-47ab-a74e-3157674666aa", 00:20:04.230 "strip_size_kb": 64, 00:20:04.230 "state": "online", 00:20:04.230 "raid_level": "raid0", 00:20:04.230 "superblock": true, 00:20:04.230 "num_base_bdevs": 4, 00:20:04.230 "num_base_bdevs_discovered": 4, 00:20:04.230 "num_base_bdevs_operational": 4, 00:20:04.230 "base_bdevs_list": [ 00:20:04.230 { 00:20:04.230 "name": "BaseBdev1", 00:20:04.230 "uuid": "ee0ecb56-2e8c-53f1-8d45-04d4e9e15968", 00:20:04.230 "is_configured": true, 00:20:04.230 "data_offset": 2048, 00:20:04.230 "data_size": 63488 00:20:04.230 }, 00:20:04.230 { 00:20:04.230 "name": "BaseBdev2", 00:20:04.230 "uuid": "a32754be-5263-52d6-9ab6-8133ad3da408", 00:20:04.230 "is_configured": true, 00:20:04.230 "data_offset": 2048, 00:20:04.230 "data_size": 63488 00:20:04.230 }, 00:20:04.230 { 00:20:04.230 "name": "BaseBdev3", 00:20:04.230 "uuid": "cef0ef49-12db-53dd-b7ce-5a93c076b723", 00:20:04.230 "is_configured": true, 00:20:04.230 "data_offset": 2048, 00:20:04.230 "data_size": 63488 00:20:04.230 }, 00:20:04.230 { 00:20:04.230 "name": "BaseBdev4", 00:20:04.230 "uuid": "41aed550-dbc0-5070-9cb1-17c7a1a982c5", 00:20:04.230 "is_configured": true, 00:20:04.230 "data_offset": 2048, 00:20:04.230 "data_size": 63488 00:20:04.230 } 00:20:04.230 ] 00:20:04.230 }' 00:20:04.230 19:55:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.230 19:55:55 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:20:04.798 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:05.057 [2024-07-24 19:55:56.433520] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:05.057 [2024-07-24 19:55:56.433561] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:05.057 [2024-07-24 19:55:56.436749] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:05.057 [2024-07-24 19:55:56.436790] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:05.057 [2024-07-24 19:55:56.436830] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:05.057 [2024-07-24 19:55:56.436841] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ae060 name raid_bdev1, state offline 00:20:05.057 0 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1450351 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1450351 ']' 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1450351 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1450351 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 1450351' 00:20:05.057 killing process with pid 1450351 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1450351 00:20:05.057 [2024-07-24 19:55:56.518808] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:05.057 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1450351 00:20:05.057 [2024-07-24 19:55:56.550588] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.H7ueYiXG9A 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.46 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.46 != \0\.\0\0 ]] 00:20:05.316 00:20:05.316 real 0m7.709s 00:20:05.316 user 0m12.337s 00:20:05.316 sys 0m1.386s 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:05.316 19:55:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:05.316 ************************************ 00:20:05.316 END TEST raid_write_error_test 00:20:05.316 ************************************ 00:20:05.316 19:55:56 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:20:05.316 19:55:56 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test 
raid_state_function_test concat 4 false 00:20:05.316 19:55:56 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:05.316 19:55:56 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:05.316 19:55:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:05.316 ************************************ 00:20:05.316 START TEST raid_state_function_test 00:20:05.316 ************************************ 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 false 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:05.316 19:55:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1451506 00:20:05.316 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1451506' 00:20:05.316 
Process raid pid: 1451506 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1451506 /var/tmp/spdk-raid.sock 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1451506 ']' 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:05.317 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:05.317 19:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:05.575 [2024-07-24 19:55:56.980937] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:20:05.575 [2024-07-24 19:55:56.981073] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:05.833 [2024-07-24 19:55:57.174925] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:05.833 [2024-07-24 19:55:57.271722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:05.833 [2024-07-24 19:55:57.336496] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:05.833 [2024-07-24 19:55:57.336532] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:06.768 19:55:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:06.768 19:55:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:20:06.768 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:07.028 [2024-07-24 19:55:58.616982] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:07.028 [2024-07-24 19:55:58.617022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:07.028 [2024-07-24 19:55:58.617032] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:07.028 [2024-07-24 19:55:58.617044] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:07.028 [2024-07-24 19:55:58.617053] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:07.028 [2024-07-24 19:55:58.617064] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:20:07.028 [2024-07-24 19:55:58.617072] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:07.028 [2024-07-24 19:55:58.617083] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.286 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.287 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.287 19:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.854 19:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.854 "name": "Existed_Raid", 00:20:07.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.854 "strip_size_kb": 64, 
00:20:07.854 "state": "configuring", 00:20:07.854 "raid_level": "concat", 00:20:07.854 "superblock": false, 00:20:07.854 "num_base_bdevs": 4, 00:20:07.854 "num_base_bdevs_discovered": 0, 00:20:07.854 "num_base_bdevs_operational": 4, 00:20:07.854 "base_bdevs_list": [ 00:20:07.854 { 00:20:07.854 "name": "BaseBdev1", 00:20:07.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.854 "is_configured": false, 00:20:07.854 "data_offset": 0, 00:20:07.854 "data_size": 0 00:20:07.854 }, 00:20:07.854 { 00:20:07.854 "name": "BaseBdev2", 00:20:07.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.854 "is_configured": false, 00:20:07.854 "data_offset": 0, 00:20:07.854 "data_size": 0 00:20:07.854 }, 00:20:07.854 { 00:20:07.854 "name": "BaseBdev3", 00:20:07.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.854 "is_configured": false, 00:20:07.854 "data_offset": 0, 00:20:07.854 "data_size": 0 00:20:07.854 }, 00:20:07.854 { 00:20:07.854 "name": "BaseBdev4", 00:20:07.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.854 "is_configured": false, 00:20:07.854 "data_offset": 0, 00:20:07.854 "data_size": 0 00:20:07.854 } 00:20:07.854 ] 00:20:07.854 }' 00:20:07.854 19:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.854 19:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.422 19:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:08.422 [2024-07-24 19:55:59.996487] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:08.422 [2024-07-24 19:55:59.996515] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d1a30 name Existed_Raid, state configuring 00:20:08.681 19:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:08.681 [2024-07-24 19:56:00.241169] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:08.681 [2024-07-24 19:56:00.241209] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:08.681 [2024-07-24 19:56:00.241219] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:08.681 [2024-07-24 19:56:00.241231] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:08.681 [2024-07-24 19:56:00.241240] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:08.681 [2024-07-24 19:56:00.241250] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:08.681 [2024-07-24 19:56:00.241259] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:08.681 [2024-07-24 19:56:00.241270] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:08.681 19:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:08.940 [2024-07-24 19:56:00.499752] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:08.940 BaseBdev1 00:20:08.940 19:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:08.940 19:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:08.940 19:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:08.940 19:56:00 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@901 -- # local i 00:20:08.940 19:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:08.940 19:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:08.940 19:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:09.199 19:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:09.458 [ 00:20:09.458 { 00:20:09.458 "name": "BaseBdev1", 00:20:09.458 "aliases": [ 00:20:09.458 "b6c519b4-3627-4ac3-bab2-83098c5990cd" 00:20:09.458 ], 00:20:09.458 "product_name": "Malloc disk", 00:20:09.458 "block_size": 512, 00:20:09.458 "num_blocks": 65536, 00:20:09.458 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:09.458 "assigned_rate_limits": { 00:20:09.458 "rw_ios_per_sec": 0, 00:20:09.458 "rw_mbytes_per_sec": 0, 00:20:09.458 "r_mbytes_per_sec": 0, 00:20:09.458 "w_mbytes_per_sec": 0 00:20:09.458 }, 00:20:09.458 "claimed": true, 00:20:09.458 "claim_type": "exclusive_write", 00:20:09.458 "zoned": false, 00:20:09.458 "supported_io_types": { 00:20:09.458 "read": true, 00:20:09.458 "write": true, 00:20:09.458 "unmap": true, 00:20:09.458 "flush": true, 00:20:09.458 "reset": true, 00:20:09.458 "nvme_admin": false, 00:20:09.458 "nvme_io": false, 00:20:09.458 "nvme_io_md": false, 00:20:09.458 "write_zeroes": true, 00:20:09.458 "zcopy": true, 00:20:09.458 "get_zone_info": false, 00:20:09.458 "zone_management": false, 00:20:09.458 "zone_append": false, 00:20:09.458 "compare": false, 00:20:09.458 "compare_and_write": false, 00:20:09.458 "abort": true, 00:20:09.458 "seek_hole": false, 00:20:09.458 "seek_data": false, 00:20:09.458 "copy": true, 00:20:09.458 "nvme_iov_md": 
false 00:20:09.458 }, 00:20:09.458 "memory_domains": [ 00:20:09.458 { 00:20:09.458 "dma_device_id": "system", 00:20:09.458 "dma_device_type": 1 00:20:09.458 }, 00:20:09.458 { 00:20:09.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.458 "dma_device_type": 2 00:20:09.458 } 00:20:09.458 ], 00:20:09.458 "driver_specific": {} 00:20:09.458 } 00:20:09.458 ] 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.458 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:09.717 19:56:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.717 "name": "Existed_Raid", 00:20:09.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.717 "strip_size_kb": 64, 00:20:09.717 "state": "configuring", 00:20:09.717 "raid_level": "concat", 00:20:09.717 "superblock": false, 00:20:09.717 "num_base_bdevs": 4, 00:20:09.717 "num_base_bdevs_discovered": 1, 00:20:09.717 "num_base_bdevs_operational": 4, 00:20:09.717 "base_bdevs_list": [ 00:20:09.717 { 00:20:09.717 "name": "BaseBdev1", 00:20:09.717 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:09.717 "is_configured": true, 00:20:09.717 "data_offset": 0, 00:20:09.717 "data_size": 65536 00:20:09.717 }, 00:20:09.717 { 00:20:09.717 "name": "BaseBdev2", 00:20:09.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.717 "is_configured": false, 00:20:09.717 "data_offset": 0, 00:20:09.717 "data_size": 0 00:20:09.717 }, 00:20:09.717 { 00:20:09.717 "name": "BaseBdev3", 00:20:09.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.717 "is_configured": false, 00:20:09.717 "data_offset": 0, 00:20:09.717 "data_size": 0 00:20:09.717 }, 00:20:09.717 { 00:20:09.717 "name": "BaseBdev4", 00:20:09.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.717 "is_configured": false, 00:20:09.717 "data_offset": 0, 00:20:09.717 "data_size": 0 00:20:09.717 } 00:20:09.717 ] 00:20:09.717 }' 00:20:09.717 19:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.717 19:56:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.655 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:10.914 [2024-07-24 19:56:02.364695] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:10.915 [2024-07-24 19:56:02.364732] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d12a0 name Existed_Raid, state configuring 00:20:10.915 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:11.174 [2024-07-24 19:56:02.609366] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:11.174 [2024-07-24 19:56:02.610777] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:11.174 [2024-07-24 19:56:02.610809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:11.174 [2024-07-24 19:56:02.610819] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:11.174 [2024-07-24 19:56:02.610831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:11.174 [2024-07-24 19:56:02.610840] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:11.174 [2024-07-24 19:56:02.610851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.174 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:11.433 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.433 "name": "Existed_Raid", 00:20:11.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.433 "strip_size_kb": 64, 00:20:11.433 "state": "configuring", 00:20:11.433 "raid_level": "concat", 00:20:11.433 "superblock": false, 00:20:11.433 "num_base_bdevs": 4, 00:20:11.433 "num_base_bdevs_discovered": 1, 00:20:11.433 "num_base_bdevs_operational": 4, 00:20:11.433 "base_bdevs_list": [ 00:20:11.433 { 00:20:11.433 "name": "BaseBdev1", 00:20:11.433 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:11.433 "is_configured": true, 00:20:11.433 "data_offset": 0, 00:20:11.433 "data_size": 65536 00:20:11.433 }, 00:20:11.433 { 00:20:11.433 "name": "BaseBdev2", 00:20:11.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.433 "is_configured": false, 00:20:11.433 "data_offset": 0, 00:20:11.433 "data_size": 0 00:20:11.433 }, 00:20:11.433 { 00:20:11.433 "name": "BaseBdev3", 
00:20:11.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.433 "is_configured": false, 00:20:11.433 "data_offset": 0, 00:20:11.433 "data_size": 0 00:20:11.433 }, 00:20:11.433 { 00:20:11.433 "name": "BaseBdev4", 00:20:11.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.433 "is_configured": false, 00:20:11.433 "data_offset": 0, 00:20:11.433 "data_size": 0 00:20:11.433 } 00:20:11.433 ] 00:20:11.433 }' 00:20:11.433 19:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.433 19:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:12.001 19:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:12.260 [2024-07-24 19:56:03.727744] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:12.260 BaseBdev2 00:20:12.260 19:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:12.260 19:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:12.260 19:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:12.260 19:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:12.260 19:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:12.260 19:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:12.260 19:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:12.519 19:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:12.778 [ 00:20:12.778 { 00:20:12.778 "name": "BaseBdev2", 00:20:12.778 "aliases": [ 00:20:12.778 "71efabdf-8b4e-4c81-ab43-19efe342f9fc" 00:20:12.778 ], 00:20:12.778 "product_name": "Malloc disk", 00:20:12.778 "block_size": 512, 00:20:12.778 "num_blocks": 65536, 00:20:12.778 "uuid": "71efabdf-8b4e-4c81-ab43-19efe342f9fc", 00:20:12.778 "assigned_rate_limits": { 00:20:12.778 "rw_ios_per_sec": 0, 00:20:12.778 "rw_mbytes_per_sec": 0, 00:20:12.778 "r_mbytes_per_sec": 0, 00:20:12.778 "w_mbytes_per_sec": 0 00:20:12.778 }, 00:20:12.778 "claimed": true, 00:20:12.778 "claim_type": "exclusive_write", 00:20:12.778 "zoned": false, 00:20:12.778 "supported_io_types": { 00:20:12.778 "read": true, 00:20:12.778 "write": true, 00:20:12.778 "unmap": true, 00:20:12.778 "flush": true, 00:20:12.778 "reset": true, 00:20:12.778 "nvme_admin": false, 00:20:12.778 "nvme_io": false, 00:20:12.778 "nvme_io_md": false, 00:20:12.778 "write_zeroes": true, 00:20:12.778 "zcopy": true, 00:20:12.778 "get_zone_info": false, 00:20:12.778 "zone_management": false, 00:20:12.778 "zone_append": false, 00:20:12.778 "compare": false, 00:20:12.778 "compare_and_write": false, 00:20:12.778 "abort": true, 00:20:12.778 "seek_hole": false, 00:20:12.778 "seek_data": false, 00:20:12.778 "copy": true, 00:20:12.778 "nvme_iov_md": false 00:20:12.778 }, 00:20:12.778 "memory_domains": [ 00:20:12.778 { 00:20:12.778 "dma_device_id": "system", 00:20:12.778 "dma_device_type": 1 00:20:12.778 }, 00:20:12.778 { 00:20:12.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.778 "dma_device_type": 2 00:20:12.778 } 00:20:12.778 ], 00:20:12.778 "driver_specific": {} 00:20:12.778 } 00:20:12.778 ] 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.778 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:13.036 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.036 "name": "Existed_Raid", 00:20:13.036 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.036 "strip_size_kb": 64, 00:20:13.036 "state": "configuring", 00:20:13.036 "raid_level": "concat", 00:20:13.036 "superblock": false, 00:20:13.037 "num_base_bdevs": 4, 00:20:13.037 
"num_base_bdevs_discovered": 2, 00:20:13.037 "num_base_bdevs_operational": 4, 00:20:13.037 "base_bdevs_list": [ 00:20:13.037 { 00:20:13.037 "name": "BaseBdev1", 00:20:13.037 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:13.037 "is_configured": true, 00:20:13.037 "data_offset": 0, 00:20:13.037 "data_size": 65536 00:20:13.037 }, 00:20:13.037 { 00:20:13.037 "name": "BaseBdev2", 00:20:13.037 "uuid": "71efabdf-8b4e-4c81-ab43-19efe342f9fc", 00:20:13.037 "is_configured": true, 00:20:13.037 "data_offset": 0, 00:20:13.037 "data_size": 65536 00:20:13.037 }, 00:20:13.037 { 00:20:13.037 "name": "BaseBdev3", 00:20:13.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.037 "is_configured": false, 00:20:13.037 "data_offset": 0, 00:20:13.037 "data_size": 0 00:20:13.037 }, 00:20:13.037 { 00:20:13.037 "name": "BaseBdev4", 00:20:13.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.037 "is_configured": false, 00:20:13.037 "data_offset": 0, 00:20:13.037 "data_size": 0 00:20:13.037 } 00:20:13.037 ] 00:20:13.037 }' 00:20:13.037 19:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.037 19:56:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.603 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:13.861 [2024-07-24 19:56:05.331405] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:13.861 BaseBdev3 00:20:13.861 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:13.861 19:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:13.861 19:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:13.862 19:56:05 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:13.862 19:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:13.862 19:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:13.862 19:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:14.121 19:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:14.380 [ 00:20:14.380 { 00:20:14.380 "name": "BaseBdev3", 00:20:14.380 "aliases": [ 00:20:14.380 "5f71b0f2-9c58-4845-8479-771e60cb2de9" 00:20:14.380 ], 00:20:14.380 "product_name": "Malloc disk", 00:20:14.380 "block_size": 512, 00:20:14.380 "num_blocks": 65536, 00:20:14.380 "uuid": "5f71b0f2-9c58-4845-8479-771e60cb2de9", 00:20:14.380 "assigned_rate_limits": { 00:20:14.380 "rw_ios_per_sec": 0, 00:20:14.380 "rw_mbytes_per_sec": 0, 00:20:14.380 "r_mbytes_per_sec": 0, 00:20:14.380 "w_mbytes_per_sec": 0 00:20:14.380 }, 00:20:14.380 "claimed": true, 00:20:14.380 "claim_type": "exclusive_write", 00:20:14.380 "zoned": false, 00:20:14.380 "supported_io_types": { 00:20:14.380 "read": true, 00:20:14.380 "write": true, 00:20:14.380 "unmap": true, 00:20:14.380 "flush": true, 00:20:14.380 "reset": true, 00:20:14.380 "nvme_admin": false, 00:20:14.380 "nvme_io": false, 00:20:14.380 "nvme_io_md": false, 00:20:14.380 "write_zeroes": true, 00:20:14.380 "zcopy": true, 00:20:14.380 "get_zone_info": false, 00:20:14.380 "zone_management": false, 00:20:14.380 "zone_append": false, 00:20:14.380 "compare": false, 00:20:14.380 "compare_and_write": false, 00:20:14.380 "abort": true, 00:20:14.380 "seek_hole": false, 00:20:14.380 "seek_data": false, 00:20:14.380 "copy": 
true, 00:20:14.380 "nvme_iov_md": false 00:20:14.380 }, 00:20:14.380 "memory_domains": [ 00:20:14.380 { 00:20:14.380 "dma_device_id": "system", 00:20:14.380 "dma_device_type": 1 00:20:14.380 }, 00:20:14.380 { 00:20:14.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.380 "dma_device_type": 2 00:20:14.380 } 00:20:14.380 ], 00:20:14.380 "driver_specific": {} 00:20:14.380 } 00:20:14.380 ] 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.380 19:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:14.639 19:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.639 "name": "Existed_Raid", 00:20:14.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.639 "strip_size_kb": 64, 00:20:14.639 "state": "configuring", 00:20:14.639 "raid_level": "concat", 00:20:14.639 "superblock": false, 00:20:14.639 "num_base_bdevs": 4, 00:20:14.639 "num_base_bdevs_discovered": 3, 00:20:14.639 "num_base_bdevs_operational": 4, 00:20:14.639 "base_bdevs_list": [ 00:20:14.640 { 00:20:14.640 "name": "BaseBdev1", 00:20:14.640 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:14.640 "is_configured": true, 00:20:14.640 "data_offset": 0, 00:20:14.640 "data_size": 65536 00:20:14.640 }, 00:20:14.640 { 00:20:14.640 "name": "BaseBdev2", 00:20:14.640 "uuid": "71efabdf-8b4e-4c81-ab43-19efe342f9fc", 00:20:14.640 "is_configured": true, 00:20:14.640 "data_offset": 0, 00:20:14.640 "data_size": 65536 00:20:14.640 }, 00:20:14.640 { 00:20:14.640 "name": "BaseBdev3", 00:20:14.640 "uuid": "5f71b0f2-9c58-4845-8479-771e60cb2de9", 00:20:14.640 "is_configured": true, 00:20:14.640 "data_offset": 0, 00:20:14.640 "data_size": 65536 00:20:14.640 }, 00:20:14.640 { 00:20:14.640 "name": "BaseBdev4", 00:20:14.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.640 "is_configured": false, 00:20:14.640 "data_offset": 0, 00:20:14.640 "data_size": 0 00:20:14.640 } 00:20:14.640 ] 00:20:14.640 }' 00:20:14.640 19:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.640 19:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:15.321 19:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:15.580 [2024-07-24 19:56:06.943093] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:15.580 [2024-07-24 19:56:06.943131] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19d2300 00:20:15.580 [2024-07-24 19:56:06.943140] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:15.580 [2024-07-24 19:56:06.943357] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19d3280 00:20:15.580 [2024-07-24 19:56:06.943492] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19d2300 00:20:15.580 [2024-07-24 19:56:06.943503] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19d2300 00:20:15.580 [2024-07-24 19:56:06.943666] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:15.580 BaseBdev4 00:20:15.580 19:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:15.580 19:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:15.580 19:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:15.580 19:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:15.580 19:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:15.580 19:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:15.580 19:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:15.839 19:56:07 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:16.098 [ 00:20:16.098 { 00:20:16.098 "name": "BaseBdev4", 00:20:16.098 "aliases": [ 00:20:16.098 "b2764efd-4f1c-4563-b3de-50c4e885d538" 00:20:16.098 ], 00:20:16.098 "product_name": "Malloc disk", 00:20:16.098 "block_size": 512, 00:20:16.098 "num_blocks": 65536, 00:20:16.098 "uuid": "b2764efd-4f1c-4563-b3de-50c4e885d538", 00:20:16.098 "assigned_rate_limits": { 00:20:16.098 "rw_ios_per_sec": 0, 00:20:16.099 "rw_mbytes_per_sec": 0, 00:20:16.099 "r_mbytes_per_sec": 0, 00:20:16.099 "w_mbytes_per_sec": 0 00:20:16.099 }, 00:20:16.099 "claimed": true, 00:20:16.099 "claim_type": "exclusive_write", 00:20:16.099 "zoned": false, 00:20:16.099 "supported_io_types": { 00:20:16.099 "read": true, 00:20:16.099 "write": true, 00:20:16.099 "unmap": true, 00:20:16.099 "flush": true, 00:20:16.099 "reset": true, 00:20:16.099 "nvme_admin": false, 00:20:16.099 "nvme_io": false, 00:20:16.099 "nvme_io_md": false, 00:20:16.099 "write_zeroes": true, 00:20:16.099 "zcopy": true, 00:20:16.099 "get_zone_info": false, 00:20:16.099 "zone_management": false, 00:20:16.099 "zone_append": false, 00:20:16.099 "compare": false, 00:20:16.099 "compare_and_write": false, 00:20:16.099 "abort": true, 00:20:16.099 "seek_hole": false, 00:20:16.099 "seek_data": false, 00:20:16.099 "copy": true, 00:20:16.099 "nvme_iov_md": false 00:20:16.099 }, 00:20:16.099 "memory_domains": [ 00:20:16.099 { 00:20:16.099 "dma_device_id": "system", 00:20:16.099 "dma_device_type": 1 00:20:16.099 }, 00:20:16.099 { 00:20:16.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.099 "dma_device_type": 2 00:20:16.099 } 00:20:16.099 ], 00:20:16.099 "driver_specific": {} 00:20:16.099 } 00:20:16.099 ] 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.099 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.358 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.358 "name": "Existed_Raid", 00:20:16.358 "uuid": "1d7ab9a8-48cf-430f-83b6-2d8533a6806d", 00:20:16.358 "strip_size_kb": 64, 00:20:16.358 "state": "online", 00:20:16.358 "raid_level": "concat", 00:20:16.358 "superblock": false, 00:20:16.358 
"num_base_bdevs": 4, 00:20:16.358 "num_base_bdevs_discovered": 4, 00:20:16.358 "num_base_bdevs_operational": 4, 00:20:16.358 "base_bdevs_list": [ 00:20:16.358 { 00:20:16.358 "name": "BaseBdev1", 00:20:16.358 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:16.358 "is_configured": true, 00:20:16.358 "data_offset": 0, 00:20:16.358 "data_size": 65536 00:20:16.358 }, 00:20:16.358 { 00:20:16.358 "name": "BaseBdev2", 00:20:16.358 "uuid": "71efabdf-8b4e-4c81-ab43-19efe342f9fc", 00:20:16.358 "is_configured": true, 00:20:16.358 "data_offset": 0, 00:20:16.358 "data_size": 65536 00:20:16.358 }, 00:20:16.358 { 00:20:16.358 "name": "BaseBdev3", 00:20:16.358 "uuid": "5f71b0f2-9c58-4845-8479-771e60cb2de9", 00:20:16.358 "is_configured": true, 00:20:16.358 "data_offset": 0, 00:20:16.358 "data_size": 65536 00:20:16.358 }, 00:20:16.358 { 00:20:16.358 "name": "BaseBdev4", 00:20:16.358 "uuid": "b2764efd-4f1c-4563-b3de-50c4e885d538", 00:20:16.358 "is_configured": true, 00:20:16.358 "data_offset": 0, 00:20:16.359 "data_size": 65536 00:20:16.359 } 00:20:16.359 ] 00:20:16.359 }' 00:20:16.359 19:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.359 19:56:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:16.926 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:16.926 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:16.926 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:16.926 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:16.926 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:16.926 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:16.926 19:56:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:16.927 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:17.186 [2024-07-24 19:56:08.551708] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:17.186 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:17.186 "name": "Existed_Raid", 00:20:17.186 "aliases": [ 00:20:17.186 "1d7ab9a8-48cf-430f-83b6-2d8533a6806d" 00:20:17.186 ], 00:20:17.186 "product_name": "Raid Volume", 00:20:17.186 "block_size": 512, 00:20:17.186 "num_blocks": 262144, 00:20:17.186 "uuid": "1d7ab9a8-48cf-430f-83b6-2d8533a6806d", 00:20:17.186 "assigned_rate_limits": { 00:20:17.186 "rw_ios_per_sec": 0, 00:20:17.186 "rw_mbytes_per_sec": 0, 00:20:17.186 "r_mbytes_per_sec": 0, 00:20:17.186 "w_mbytes_per_sec": 0 00:20:17.186 }, 00:20:17.186 "claimed": false, 00:20:17.186 "zoned": false, 00:20:17.186 "supported_io_types": { 00:20:17.186 "read": true, 00:20:17.186 "write": true, 00:20:17.186 "unmap": true, 00:20:17.186 "flush": true, 00:20:17.186 "reset": true, 00:20:17.186 "nvme_admin": false, 00:20:17.186 "nvme_io": false, 00:20:17.186 "nvme_io_md": false, 00:20:17.186 "write_zeroes": true, 00:20:17.186 "zcopy": false, 00:20:17.186 "get_zone_info": false, 00:20:17.186 "zone_management": false, 00:20:17.186 "zone_append": false, 00:20:17.186 "compare": false, 00:20:17.186 "compare_and_write": false, 00:20:17.186 "abort": false, 00:20:17.186 "seek_hole": false, 00:20:17.186 "seek_data": false, 00:20:17.186 "copy": false, 00:20:17.186 "nvme_iov_md": false 00:20:17.186 }, 00:20:17.186 "memory_domains": [ 00:20:17.186 { 00:20:17.186 "dma_device_id": "system", 00:20:17.186 "dma_device_type": 1 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.186 
"dma_device_type": 2 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "dma_device_id": "system", 00:20:17.186 "dma_device_type": 1 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.186 "dma_device_type": 2 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "dma_device_id": "system", 00:20:17.186 "dma_device_type": 1 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.186 "dma_device_type": 2 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "dma_device_id": "system", 00:20:17.186 "dma_device_type": 1 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.186 "dma_device_type": 2 00:20:17.186 } 00:20:17.186 ], 00:20:17.186 "driver_specific": { 00:20:17.186 "raid": { 00:20:17.186 "uuid": "1d7ab9a8-48cf-430f-83b6-2d8533a6806d", 00:20:17.186 "strip_size_kb": 64, 00:20:17.186 "state": "online", 00:20:17.186 "raid_level": "concat", 00:20:17.186 "superblock": false, 00:20:17.186 "num_base_bdevs": 4, 00:20:17.186 "num_base_bdevs_discovered": 4, 00:20:17.186 "num_base_bdevs_operational": 4, 00:20:17.186 "base_bdevs_list": [ 00:20:17.186 { 00:20:17.186 "name": "BaseBdev1", 00:20:17.186 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:17.186 "is_configured": true, 00:20:17.186 "data_offset": 0, 00:20:17.186 "data_size": 65536 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "name": "BaseBdev2", 00:20:17.186 "uuid": "71efabdf-8b4e-4c81-ab43-19efe342f9fc", 00:20:17.186 "is_configured": true, 00:20:17.186 "data_offset": 0, 00:20:17.186 "data_size": 65536 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "name": "BaseBdev3", 00:20:17.186 "uuid": "5f71b0f2-9c58-4845-8479-771e60cb2de9", 00:20:17.186 "is_configured": true, 00:20:17.186 "data_offset": 0, 00:20:17.186 "data_size": 65536 00:20:17.186 }, 00:20:17.186 { 00:20:17.186 "name": "BaseBdev4", 00:20:17.186 "uuid": "b2764efd-4f1c-4563-b3de-50c4e885d538", 00:20:17.186 "is_configured": true, 00:20:17.186 "data_offset": 0, 
00:20:17.186 "data_size": 65536 00:20:17.186 } 00:20:17.186 ] 00:20:17.186 } 00:20:17.186 } 00:20:17.186 }' 00:20:17.186 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:17.186 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:17.186 BaseBdev2 00:20:17.186 BaseBdev3 00:20:17.186 BaseBdev4' 00:20:17.186 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:17.186 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:17.186 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.446 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.446 "name": "BaseBdev1", 00:20:17.446 "aliases": [ 00:20:17.446 "b6c519b4-3627-4ac3-bab2-83098c5990cd" 00:20:17.446 ], 00:20:17.446 "product_name": "Malloc disk", 00:20:17.446 "block_size": 512, 00:20:17.446 "num_blocks": 65536, 00:20:17.446 "uuid": "b6c519b4-3627-4ac3-bab2-83098c5990cd", 00:20:17.446 "assigned_rate_limits": { 00:20:17.446 "rw_ios_per_sec": 0, 00:20:17.446 "rw_mbytes_per_sec": 0, 00:20:17.446 "r_mbytes_per_sec": 0, 00:20:17.446 "w_mbytes_per_sec": 0 00:20:17.446 }, 00:20:17.446 "claimed": true, 00:20:17.446 "claim_type": "exclusive_write", 00:20:17.446 "zoned": false, 00:20:17.446 "supported_io_types": { 00:20:17.446 "read": true, 00:20:17.446 "write": true, 00:20:17.446 "unmap": true, 00:20:17.446 "flush": true, 00:20:17.446 "reset": true, 00:20:17.446 "nvme_admin": false, 00:20:17.446 "nvme_io": false, 00:20:17.446 "nvme_io_md": false, 00:20:17.446 "write_zeroes": true, 00:20:17.446 "zcopy": true, 00:20:17.446 "get_zone_info": false, 00:20:17.446 "zone_management": 
false, 00:20:17.446 "zone_append": false, 00:20:17.446 "compare": false, 00:20:17.446 "compare_and_write": false, 00:20:17.446 "abort": true, 00:20:17.446 "seek_hole": false, 00:20:17.446 "seek_data": false, 00:20:17.446 "copy": true, 00:20:17.446 "nvme_iov_md": false 00:20:17.446 }, 00:20:17.446 "memory_domains": [ 00:20:17.446 { 00:20:17.446 "dma_device_id": "system", 00:20:17.446 "dma_device_type": 1 00:20:17.446 }, 00:20:17.446 { 00:20:17.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.446 "dma_device_type": 2 00:20:17.446 } 00:20:17.446 ], 00:20:17.446 "driver_specific": {} 00:20:17.446 }' 00:20:17.446 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.446 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.446 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:17.446 19:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.446 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.705 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:17.705 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.705 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.705 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:17.706 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.706 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.706 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:17.706 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:17.706 19:56:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:17.706 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.965 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.965 "name": "BaseBdev2", 00:20:17.965 "aliases": [ 00:20:17.965 "71efabdf-8b4e-4c81-ab43-19efe342f9fc" 00:20:17.965 ], 00:20:17.965 "product_name": "Malloc disk", 00:20:17.965 "block_size": 512, 00:20:17.965 "num_blocks": 65536, 00:20:17.965 "uuid": "71efabdf-8b4e-4c81-ab43-19efe342f9fc", 00:20:17.965 "assigned_rate_limits": { 00:20:17.965 "rw_ios_per_sec": 0, 00:20:17.965 "rw_mbytes_per_sec": 0, 00:20:17.965 "r_mbytes_per_sec": 0, 00:20:17.965 "w_mbytes_per_sec": 0 00:20:17.965 }, 00:20:17.965 "claimed": true, 00:20:17.965 "claim_type": "exclusive_write", 00:20:17.965 "zoned": false, 00:20:17.965 "supported_io_types": { 00:20:17.965 "read": true, 00:20:17.965 "write": true, 00:20:17.965 "unmap": true, 00:20:17.965 "flush": true, 00:20:17.965 "reset": true, 00:20:17.965 "nvme_admin": false, 00:20:17.965 "nvme_io": false, 00:20:17.965 "nvme_io_md": false, 00:20:17.965 "write_zeroes": true, 00:20:17.965 "zcopy": true, 00:20:17.965 "get_zone_info": false, 00:20:17.965 "zone_management": false, 00:20:17.965 "zone_append": false, 00:20:17.965 "compare": false, 00:20:17.965 "compare_and_write": false, 00:20:17.965 "abort": true, 00:20:17.965 "seek_hole": false, 00:20:17.965 "seek_data": false, 00:20:17.965 "copy": true, 00:20:17.965 "nvme_iov_md": false 00:20:17.965 }, 00:20:17.965 "memory_domains": [ 00:20:17.965 { 00:20:17.965 "dma_device_id": "system", 00:20:17.965 "dma_device_type": 1 00:20:17.965 }, 00:20:17.965 { 00:20:17.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.965 "dma_device_type": 2 00:20:17.965 } 00:20:17.965 ], 00:20:17.965 "driver_specific": {} 00:20:17.965 
}' 00:20:17.965 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.965 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.965 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:17.965 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.223 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.223 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:18.223 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.223 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.223 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:18.223 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.223 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.482 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:18.482 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:18.482 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:18.482 19:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:18.482 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:18.482 "name": "BaseBdev3", 00:20:18.482 "aliases": [ 00:20:18.482 "5f71b0f2-9c58-4845-8479-771e60cb2de9" 00:20:18.482 ], 00:20:18.482 "product_name": "Malloc disk", 00:20:18.482 "block_size": 512, 00:20:18.482 "num_blocks": 65536, 
00:20:18.482 "uuid": "5f71b0f2-9c58-4845-8479-771e60cb2de9", 00:20:18.482 "assigned_rate_limits": { 00:20:18.482 "rw_ios_per_sec": 0, 00:20:18.482 "rw_mbytes_per_sec": 0, 00:20:18.482 "r_mbytes_per_sec": 0, 00:20:18.482 "w_mbytes_per_sec": 0 00:20:18.482 }, 00:20:18.482 "claimed": true, 00:20:18.482 "claim_type": "exclusive_write", 00:20:18.482 "zoned": false, 00:20:18.482 "supported_io_types": { 00:20:18.482 "read": true, 00:20:18.482 "write": true, 00:20:18.482 "unmap": true, 00:20:18.482 "flush": true, 00:20:18.482 "reset": true, 00:20:18.482 "nvme_admin": false, 00:20:18.482 "nvme_io": false, 00:20:18.482 "nvme_io_md": false, 00:20:18.482 "write_zeroes": true, 00:20:18.482 "zcopy": true, 00:20:18.482 "get_zone_info": false, 00:20:18.482 "zone_management": false, 00:20:18.482 "zone_append": false, 00:20:18.482 "compare": false, 00:20:18.482 "compare_and_write": false, 00:20:18.482 "abort": true, 00:20:18.482 "seek_hole": false, 00:20:18.482 "seek_data": false, 00:20:18.482 "copy": true, 00:20:18.482 "nvme_iov_md": false 00:20:18.482 }, 00:20:18.482 "memory_domains": [ 00:20:18.482 { 00:20:18.482 "dma_device_id": "system", 00:20:18.482 "dma_device_type": 1 00:20:18.482 }, 00:20:18.482 { 00:20:18.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:18.482 "dma_device_type": 2 00:20:18.482 } 00:20:18.482 ], 00:20:18.482 "driver_specific": {} 00:20:18.482 }' 00:20:18.740 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.740 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.740 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:18.741 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.741 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.741 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:20:18.741 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.741 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.741 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:18.741 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.998 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.998 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:18.998 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:18.998 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:18.998 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:19.257 "name": "BaseBdev4", 00:20:19.257 "aliases": [ 00:20:19.257 "b2764efd-4f1c-4563-b3de-50c4e885d538" 00:20:19.257 ], 00:20:19.257 "product_name": "Malloc disk", 00:20:19.257 "block_size": 512, 00:20:19.257 "num_blocks": 65536, 00:20:19.257 "uuid": "b2764efd-4f1c-4563-b3de-50c4e885d538", 00:20:19.257 "assigned_rate_limits": { 00:20:19.257 "rw_ios_per_sec": 0, 00:20:19.257 "rw_mbytes_per_sec": 0, 00:20:19.257 "r_mbytes_per_sec": 0, 00:20:19.257 "w_mbytes_per_sec": 0 00:20:19.257 }, 00:20:19.257 "claimed": true, 00:20:19.257 "claim_type": "exclusive_write", 00:20:19.257 "zoned": false, 00:20:19.257 "supported_io_types": { 00:20:19.257 "read": true, 00:20:19.257 "write": true, 00:20:19.257 "unmap": true, 00:20:19.257 "flush": true, 00:20:19.257 "reset": true, 00:20:19.257 "nvme_admin": false, 00:20:19.257 "nvme_io": false, 00:20:19.257 
"nvme_io_md": false, 00:20:19.257 "write_zeroes": true, 00:20:19.257 "zcopy": true, 00:20:19.257 "get_zone_info": false, 00:20:19.257 "zone_management": false, 00:20:19.257 "zone_append": false, 00:20:19.257 "compare": false, 00:20:19.257 "compare_and_write": false, 00:20:19.257 "abort": true, 00:20:19.257 "seek_hole": false, 00:20:19.257 "seek_data": false, 00:20:19.257 "copy": true, 00:20:19.257 "nvme_iov_md": false 00:20:19.257 }, 00:20:19.257 "memory_domains": [ 00:20:19.257 { 00:20:19.257 "dma_device_id": "system", 00:20:19.257 "dma_device_type": 1 00:20:19.257 }, 00:20:19.257 { 00:20:19.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.257 "dma_device_type": 2 00:20:19.257 } 00:20:19.257 ], 00:20:19.257 "driver_specific": {} 00:20:19.257 }' 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:19.257 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:19.516 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:19.516 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:19.516 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:19.516 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:19.516 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:20:19.516 19:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:19.775 [2024-07-24 19:56:11.222659] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:19.775 [2024-07-24 19:56:11.222684] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:19.775 [2024-07-24 19:56:11.222729] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.775 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:20.034 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.034 "name": "Existed_Raid", 00:20:20.034 "uuid": "1d7ab9a8-48cf-430f-83b6-2d8533a6806d", 00:20:20.034 "strip_size_kb": 64, 00:20:20.034 "state": "offline", 00:20:20.034 "raid_level": "concat", 00:20:20.034 "superblock": false, 00:20:20.034 "num_base_bdevs": 4, 00:20:20.034 "num_base_bdevs_discovered": 3, 00:20:20.034 "num_base_bdevs_operational": 3, 00:20:20.034 "base_bdevs_list": [ 00:20:20.034 { 00:20:20.034 "name": null, 00:20:20.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.034 "is_configured": false, 00:20:20.034 "data_offset": 0, 00:20:20.034 "data_size": 65536 00:20:20.034 }, 00:20:20.034 { 00:20:20.034 "name": "BaseBdev2", 00:20:20.034 "uuid": "71efabdf-8b4e-4c81-ab43-19efe342f9fc", 00:20:20.034 "is_configured": true, 00:20:20.034 "data_offset": 0, 00:20:20.034 "data_size": 65536 00:20:20.034 }, 00:20:20.034 { 00:20:20.034 "name": "BaseBdev3", 00:20:20.034 "uuid": "5f71b0f2-9c58-4845-8479-771e60cb2de9", 00:20:20.034 "is_configured": true, 00:20:20.034 "data_offset": 0, 00:20:20.034 "data_size": 65536 00:20:20.034 }, 00:20:20.034 { 00:20:20.034 "name": "BaseBdev4", 00:20:20.034 "uuid": "b2764efd-4f1c-4563-b3de-50c4e885d538", 00:20:20.034 "is_configured": true, 00:20:20.034 "data_offset": 0, 00:20:20.034 "data_size": 65536 00:20:20.034 } 00:20:20.034 ] 00:20:20.034 }' 
00:20:20.034 19:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.034 19:56:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.602 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:20.602 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:20.602 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.602 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:20.860 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:20.860 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:20.860 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:21.119 [2024-07-24 19:56:12.543179] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:21.119 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:21.119 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:21.119 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.119 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:21.377 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:21.377 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:20:21.377 19:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:21.636 [2024-07-24 19:56:13.031004] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:21.636 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:21.636 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:21.636 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.636 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:21.894 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:21.895 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:21.895 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:22.153 [2024-07-24 19:56:13.539966] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:22.153 [2024-07-24 19:56:13.540005] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d2300 name Existed_Raid, state offline 00:20:22.153 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:22.153 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:22.153 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.153 19:56:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:22.412 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:22.412 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:22.412 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:22.412 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:22.412 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:22.412 19:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:22.670 BaseBdev2 00:20:22.670 19:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:22.670 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:22.670 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:22.670 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:22.670 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:22.670 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:22.670 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:22.928 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:23.187 [ 00:20:23.187 { 00:20:23.187 "name": 
"BaseBdev2", 00:20:23.187 "aliases": [ 00:20:23.187 "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf" 00:20:23.187 ], 00:20:23.187 "product_name": "Malloc disk", 00:20:23.187 "block_size": 512, 00:20:23.187 "num_blocks": 65536, 00:20:23.187 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:23.187 "assigned_rate_limits": { 00:20:23.187 "rw_ios_per_sec": 0, 00:20:23.187 "rw_mbytes_per_sec": 0, 00:20:23.187 "r_mbytes_per_sec": 0, 00:20:23.187 "w_mbytes_per_sec": 0 00:20:23.187 }, 00:20:23.187 "claimed": false, 00:20:23.187 "zoned": false, 00:20:23.187 "supported_io_types": { 00:20:23.187 "read": true, 00:20:23.187 "write": true, 00:20:23.187 "unmap": true, 00:20:23.187 "flush": true, 00:20:23.187 "reset": true, 00:20:23.187 "nvme_admin": false, 00:20:23.187 "nvme_io": false, 00:20:23.187 "nvme_io_md": false, 00:20:23.187 "write_zeroes": true, 00:20:23.187 "zcopy": true, 00:20:23.187 "get_zone_info": false, 00:20:23.187 "zone_management": false, 00:20:23.187 "zone_append": false, 00:20:23.187 "compare": false, 00:20:23.187 "compare_and_write": false, 00:20:23.187 "abort": true, 00:20:23.187 "seek_hole": false, 00:20:23.187 "seek_data": false, 00:20:23.187 "copy": true, 00:20:23.187 "nvme_iov_md": false 00:20:23.187 }, 00:20:23.187 "memory_domains": [ 00:20:23.187 { 00:20:23.187 "dma_device_id": "system", 00:20:23.187 "dma_device_type": 1 00:20:23.187 }, 00:20:23.187 { 00:20:23.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.187 "dma_device_type": 2 00:20:23.187 } 00:20:23.187 ], 00:20:23.187 "driver_specific": {} 00:20:23.187 } 00:20:23.187 ] 00:20:23.187 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:23.187 19:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:23.187 19:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:23.187 19:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:23.446 BaseBdev3 00:20:23.446 19:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:23.446 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:23.446 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:23.446 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:23.446 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:23.446 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:23.446 19:56:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:23.705 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:23.705 [ 00:20:23.705 { 00:20:23.705 "name": "BaseBdev3", 00:20:23.705 "aliases": [ 00:20:23.705 "0255f2ef-51aa-47dc-9c77-0315e1e53f56" 00:20:23.705 ], 00:20:23.705 "product_name": "Malloc disk", 00:20:23.705 "block_size": 512, 00:20:23.705 "num_blocks": 65536, 00:20:23.705 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:23.705 "assigned_rate_limits": { 00:20:23.705 "rw_ios_per_sec": 0, 00:20:23.705 "rw_mbytes_per_sec": 0, 00:20:23.705 "r_mbytes_per_sec": 0, 00:20:23.705 "w_mbytes_per_sec": 0 00:20:23.705 }, 00:20:23.705 "claimed": false, 00:20:23.705 "zoned": false, 00:20:23.705 "supported_io_types": { 00:20:23.705 "read": true, 00:20:23.705 "write": true, 00:20:23.705 "unmap": true, 00:20:23.705 "flush": true, 00:20:23.705 
"reset": true, 00:20:23.705 "nvme_admin": false, 00:20:23.705 "nvme_io": false, 00:20:23.705 "nvme_io_md": false, 00:20:23.705 "write_zeroes": true, 00:20:23.705 "zcopy": true, 00:20:23.705 "get_zone_info": false, 00:20:23.705 "zone_management": false, 00:20:23.705 "zone_append": false, 00:20:23.705 "compare": false, 00:20:23.705 "compare_and_write": false, 00:20:23.705 "abort": true, 00:20:23.705 "seek_hole": false, 00:20:23.705 "seek_data": false, 00:20:23.705 "copy": true, 00:20:23.705 "nvme_iov_md": false 00:20:23.705 }, 00:20:23.705 "memory_domains": [ 00:20:23.705 { 00:20:23.705 "dma_device_id": "system", 00:20:23.705 "dma_device_type": 1 00:20:23.705 }, 00:20:23.705 { 00:20:23.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.705 "dma_device_type": 2 00:20:23.705 } 00:20:23.705 ], 00:20:23.705 "driver_specific": {} 00:20:23.705 } 00:20:23.705 ] 00:20:23.964 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:23.964 19:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:23.964 19:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:23.964 19:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:23.964 BaseBdev4 00:20:24.223 19:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:24.223 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:24.223 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:24.223 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:24.223 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:24.223 19:56:15 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:24.223 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:24.223 19:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:24.482 [ 00:20:24.482 { 00:20:24.482 "name": "BaseBdev4", 00:20:24.482 "aliases": [ 00:20:24.482 "9d13fba3-838e-4625-b4d5-d0b86de5aa2e" 00:20:24.482 ], 00:20:24.482 "product_name": "Malloc disk", 00:20:24.482 "block_size": 512, 00:20:24.482 "num_blocks": 65536, 00:20:24.482 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:24.482 "assigned_rate_limits": { 00:20:24.482 "rw_ios_per_sec": 0, 00:20:24.482 "rw_mbytes_per_sec": 0, 00:20:24.482 "r_mbytes_per_sec": 0, 00:20:24.482 "w_mbytes_per_sec": 0 00:20:24.482 }, 00:20:24.482 "claimed": false, 00:20:24.482 "zoned": false, 00:20:24.482 "supported_io_types": { 00:20:24.482 "read": true, 00:20:24.482 "write": true, 00:20:24.482 "unmap": true, 00:20:24.482 "flush": true, 00:20:24.483 "reset": true, 00:20:24.483 "nvme_admin": false, 00:20:24.483 "nvme_io": false, 00:20:24.483 "nvme_io_md": false, 00:20:24.483 "write_zeroes": true, 00:20:24.483 "zcopy": true, 00:20:24.483 "get_zone_info": false, 00:20:24.483 "zone_management": false, 00:20:24.483 "zone_append": false, 00:20:24.483 "compare": false, 00:20:24.483 "compare_and_write": false, 00:20:24.483 "abort": true, 00:20:24.483 "seek_hole": false, 00:20:24.483 "seek_data": false, 00:20:24.483 "copy": true, 00:20:24.483 "nvme_iov_md": false 00:20:24.483 }, 00:20:24.483 "memory_domains": [ 00:20:24.483 { 00:20:24.483 "dma_device_id": "system", 00:20:24.483 "dma_device_type": 1 00:20:24.483 }, 00:20:24.483 { 00:20:24.483 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:24.483 "dma_device_type": 2 00:20:24.483 } 00:20:24.483 ], 00:20:24.483 "driver_specific": {} 00:20:24.483 } 00:20:24.483 ] 00:20:24.483 19:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:24.483 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:24.483 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:24.483 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:24.741 [2024-07-24 19:56:16.267346] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:24.741 [2024-07-24 19:56:16.267384] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:24.741 [2024-07-24 19:56:16.267409] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:24.741 [2024-07-24 19:56:16.268906] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:24.741 [2024-07-24 19:56:16.268948] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.742 
19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.742 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.000 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.000 "name": "Existed_Raid", 00:20:25.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.000 "strip_size_kb": 64, 00:20:25.000 "state": "configuring", 00:20:25.000 "raid_level": "concat", 00:20:25.000 "superblock": false, 00:20:25.000 "num_base_bdevs": 4, 00:20:25.000 "num_base_bdevs_discovered": 3, 00:20:25.000 "num_base_bdevs_operational": 4, 00:20:25.000 "base_bdevs_list": [ 00:20:25.000 { 00:20:25.000 "name": "BaseBdev1", 00:20:25.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.000 "is_configured": false, 00:20:25.000 "data_offset": 0, 00:20:25.000 "data_size": 0 00:20:25.001 }, 00:20:25.001 { 00:20:25.001 "name": "BaseBdev2", 00:20:25.001 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:25.001 "is_configured": true, 00:20:25.001 "data_offset": 0, 00:20:25.001 "data_size": 65536 00:20:25.001 }, 00:20:25.001 { 00:20:25.001 "name": "BaseBdev3", 00:20:25.001 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:25.001 "is_configured": true, 00:20:25.001 "data_offset": 
0, 00:20:25.001 "data_size": 65536 00:20:25.001 }, 00:20:25.001 { 00:20:25.001 "name": "BaseBdev4", 00:20:25.001 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:25.001 "is_configured": true, 00:20:25.001 "data_offset": 0, 00:20:25.001 "data_size": 65536 00:20:25.001 } 00:20:25.001 ] 00:20:25.001 }' 00:20:25.001 19:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.001 19:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:25.937 [2024-07-24 19:56:17.406342] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.937 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:26.196 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.196 "name": "Existed_Raid", 00:20:26.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.196 "strip_size_kb": 64, 00:20:26.196 "state": "configuring", 00:20:26.196 "raid_level": "concat", 00:20:26.196 "superblock": false, 00:20:26.196 "num_base_bdevs": 4, 00:20:26.196 "num_base_bdevs_discovered": 2, 00:20:26.196 "num_base_bdevs_operational": 4, 00:20:26.196 "base_bdevs_list": [ 00:20:26.196 { 00:20:26.196 "name": "BaseBdev1", 00:20:26.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.196 "is_configured": false, 00:20:26.196 "data_offset": 0, 00:20:26.196 "data_size": 0 00:20:26.196 }, 00:20:26.196 { 00:20:26.196 "name": null, 00:20:26.196 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:26.196 "is_configured": false, 00:20:26.196 "data_offset": 0, 00:20:26.196 "data_size": 65536 00:20:26.196 }, 00:20:26.196 { 00:20:26.196 "name": "BaseBdev3", 00:20:26.196 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:26.196 "is_configured": true, 00:20:26.196 "data_offset": 0, 00:20:26.196 "data_size": 65536 00:20:26.196 }, 00:20:26.196 { 00:20:26.196 "name": "BaseBdev4", 00:20:26.196 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:26.196 "is_configured": true, 00:20:26.196 "data_offset": 0, 00:20:26.196 "data_size": 65536 00:20:26.196 } 00:20:26.196 ] 00:20:26.196 }' 00:20:26.196 19:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.196 19:56:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.766 19:56:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.766 19:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:27.025 19:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:27.025 19:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:27.285 [2024-07-24 19:56:18.758485] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:27.285 BaseBdev1 00:20:27.285 19:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:27.285 19:56:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:27.285 19:56:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:27.285 19:56:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:27.285 19:56:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:27.285 19:56:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:27.285 19:56:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:27.543 19:56:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:27.803 [ 00:20:27.803 { 00:20:27.803 "name": "BaseBdev1", 00:20:27.803 "aliases": [ 00:20:27.803 
"5e555668-3a7f-4752-84dd-7b3caf0a34a3" 00:20:27.803 ], 00:20:27.803 "product_name": "Malloc disk", 00:20:27.803 "block_size": 512, 00:20:27.803 "num_blocks": 65536, 00:20:27.803 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:27.803 "assigned_rate_limits": { 00:20:27.803 "rw_ios_per_sec": 0, 00:20:27.803 "rw_mbytes_per_sec": 0, 00:20:27.803 "r_mbytes_per_sec": 0, 00:20:27.803 "w_mbytes_per_sec": 0 00:20:27.803 }, 00:20:27.803 "claimed": true, 00:20:27.803 "claim_type": "exclusive_write", 00:20:27.803 "zoned": false, 00:20:27.803 "supported_io_types": { 00:20:27.803 "read": true, 00:20:27.803 "write": true, 00:20:27.803 "unmap": true, 00:20:27.803 "flush": true, 00:20:27.803 "reset": true, 00:20:27.803 "nvme_admin": false, 00:20:27.803 "nvme_io": false, 00:20:27.803 "nvme_io_md": false, 00:20:27.803 "write_zeroes": true, 00:20:27.803 "zcopy": true, 00:20:27.803 "get_zone_info": false, 00:20:27.803 "zone_management": false, 00:20:27.803 "zone_append": false, 00:20:27.803 "compare": false, 00:20:27.803 "compare_and_write": false, 00:20:27.803 "abort": true, 00:20:27.803 "seek_hole": false, 00:20:27.803 "seek_data": false, 00:20:27.803 "copy": true, 00:20:27.803 "nvme_iov_md": false 00:20:27.803 }, 00:20:27.803 "memory_domains": [ 00:20:27.803 { 00:20:27.803 "dma_device_id": "system", 00:20:27.803 "dma_device_type": 1 00:20:27.803 }, 00:20:27.803 { 00:20:27.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:27.803 "dma_device_type": 2 00:20:27.803 } 00:20:27.803 ], 00:20:27.803 "driver_specific": {} 00:20:27.803 } 00:20:27.803 ] 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.803 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:28.062 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.062 "name": "Existed_Raid", 00:20:28.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.062 "strip_size_kb": 64, 00:20:28.062 "state": "configuring", 00:20:28.062 "raid_level": "concat", 00:20:28.062 "superblock": false, 00:20:28.062 "num_base_bdevs": 4, 00:20:28.062 "num_base_bdevs_discovered": 3, 00:20:28.062 "num_base_bdevs_operational": 4, 00:20:28.062 "base_bdevs_list": [ 00:20:28.062 { 00:20:28.062 "name": "BaseBdev1", 00:20:28.062 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:28.062 "is_configured": true, 00:20:28.062 "data_offset": 0, 00:20:28.062 "data_size": 65536 00:20:28.062 }, 00:20:28.062 { 00:20:28.062 "name": null, 00:20:28.062 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 
00:20:28.062 "is_configured": false, 00:20:28.062 "data_offset": 0, 00:20:28.062 "data_size": 65536 00:20:28.062 }, 00:20:28.062 { 00:20:28.062 "name": "BaseBdev3", 00:20:28.062 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:28.062 "is_configured": true, 00:20:28.062 "data_offset": 0, 00:20:28.062 "data_size": 65536 00:20:28.062 }, 00:20:28.062 { 00:20:28.062 "name": "BaseBdev4", 00:20:28.062 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:28.062 "is_configured": true, 00:20:28.062 "data_offset": 0, 00:20:28.062 "data_size": 65536 00:20:28.062 } 00:20:28.062 ] 00:20:28.062 }' 00:20:28.062 19:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.062 19:56:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.630 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:28.630 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.889 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:28.889 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:29.459 [2024-07-24 19:56:20.868102] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:29.459 19:56:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.459 19:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.718 19:56:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.718 "name": "Existed_Raid", 00:20:29.718 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.718 "strip_size_kb": 64, 00:20:29.718 "state": "configuring", 00:20:29.718 "raid_level": "concat", 00:20:29.718 "superblock": false, 00:20:29.718 "num_base_bdevs": 4, 00:20:29.718 "num_base_bdevs_discovered": 2, 00:20:29.718 "num_base_bdevs_operational": 4, 00:20:29.719 "base_bdevs_list": [ 00:20:29.719 { 00:20:29.719 "name": "BaseBdev1", 00:20:29.719 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:29.719 "is_configured": true, 00:20:29.719 "data_offset": 0, 00:20:29.719 "data_size": 65536 00:20:29.719 }, 00:20:29.719 { 00:20:29.719 "name": null, 00:20:29.719 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:29.719 "is_configured": false, 00:20:29.719 "data_offset": 0, 00:20:29.719 
"data_size": 65536 00:20:29.719 }, 00:20:29.719 { 00:20:29.719 "name": null, 00:20:29.719 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:29.719 "is_configured": false, 00:20:29.719 "data_offset": 0, 00:20:29.719 "data_size": 65536 00:20:29.719 }, 00:20:29.719 { 00:20:29.719 "name": "BaseBdev4", 00:20:29.719 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:29.719 "is_configured": true, 00:20:29.719 "data_offset": 0, 00:20:29.719 "data_size": 65536 00:20:29.719 } 00:20:29.719 ] 00:20:29.719 }' 00:20:29.719 19:56:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.719 19:56:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.286 19:56:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.286 19:56:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:30.545 19:56:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:30.545 19:56:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:30.804 [2024-07-24 19:56:22.211759] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.804 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:31.064 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.064 "name": "Existed_Raid", 00:20:31.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.064 "strip_size_kb": 64, 00:20:31.064 "state": "configuring", 00:20:31.064 "raid_level": "concat", 00:20:31.064 "superblock": false, 00:20:31.064 "num_base_bdevs": 4, 00:20:31.064 "num_base_bdevs_discovered": 3, 00:20:31.064 "num_base_bdevs_operational": 4, 00:20:31.064 "base_bdevs_list": [ 00:20:31.064 { 00:20:31.064 "name": "BaseBdev1", 00:20:31.064 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:31.064 "is_configured": true, 00:20:31.064 "data_offset": 0, 00:20:31.064 "data_size": 65536 00:20:31.064 }, 00:20:31.064 { 00:20:31.064 "name": null, 00:20:31.064 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:31.064 "is_configured": false, 00:20:31.064 "data_offset": 0, 00:20:31.064 "data_size": 65536 00:20:31.064 }, 00:20:31.064 { 00:20:31.064 "name": 
"BaseBdev3", 00:20:31.064 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:31.064 "is_configured": true, 00:20:31.064 "data_offset": 0, 00:20:31.064 "data_size": 65536 00:20:31.064 }, 00:20:31.064 { 00:20:31.064 "name": "BaseBdev4", 00:20:31.064 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:31.064 "is_configured": true, 00:20:31.064 "data_offset": 0, 00:20:31.064 "data_size": 65536 00:20:31.064 } 00:20:31.064 ] 00:20:31.064 }' 00:20:31.064 19:56:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.064 19:56:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:31.632 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.632 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:31.897 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:31.897 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:32.156 [2024-07-24 19:56:23.563357] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.156 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:32.415 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:32.415 "name": "Existed_Raid", 00:20:32.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.415 "strip_size_kb": 64, 00:20:32.415 "state": "configuring", 00:20:32.415 "raid_level": "concat", 00:20:32.415 "superblock": false, 00:20:32.415 "num_base_bdevs": 4, 00:20:32.415 "num_base_bdevs_discovered": 2, 00:20:32.415 "num_base_bdevs_operational": 4, 00:20:32.415 "base_bdevs_list": [ 00:20:32.415 { 00:20:32.415 "name": null, 00:20:32.415 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:32.415 "is_configured": false, 00:20:32.415 "data_offset": 0, 00:20:32.415 "data_size": 65536 00:20:32.415 }, 00:20:32.415 { 00:20:32.415 "name": null, 00:20:32.415 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:32.415 "is_configured": false, 00:20:32.415 "data_offset": 0, 00:20:32.415 "data_size": 65536 00:20:32.415 }, 00:20:32.415 { 00:20:32.415 "name": "BaseBdev3", 00:20:32.415 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:32.415 "is_configured": true, 
00:20:32.415 "data_offset": 0, 00:20:32.415 "data_size": 65536 00:20:32.415 }, 00:20:32.415 { 00:20:32.415 "name": "BaseBdev4", 00:20:32.415 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:32.415 "is_configured": true, 00:20:32.415 "data_offset": 0, 00:20:32.415 "data_size": 65536 00:20:32.415 } 00:20:32.415 ] 00:20:32.415 }' 00:20:32.415 19:56:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:32.415 19:56:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.982 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.982 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:33.240 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:33.240 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:33.499 [2024-07-24 19:56:24.923354] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.499 19:56:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:33.759 19:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.759 "name": "Existed_Raid", 00:20:33.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.759 "strip_size_kb": 64, 00:20:33.759 "state": "configuring", 00:20:33.759 "raid_level": "concat", 00:20:33.759 "superblock": false, 00:20:33.759 "num_base_bdevs": 4, 00:20:33.759 "num_base_bdevs_discovered": 3, 00:20:33.759 "num_base_bdevs_operational": 4, 00:20:33.759 "base_bdevs_list": [ 00:20:33.759 { 00:20:33.759 "name": null, 00:20:33.759 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:33.759 "is_configured": false, 00:20:33.759 "data_offset": 0, 00:20:33.759 "data_size": 65536 00:20:33.759 }, 00:20:33.759 { 00:20:33.759 "name": "BaseBdev2", 00:20:33.759 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:33.759 "is_configured": true, 00:20:33.759 "data_offset": 0, 00:20:33.759 "data_size": 65536 00:20:33.759 }, 00:20:33.759 { 00:20:33.759 "name": "BaseBdev3", 00:20:33.759 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:33.759 "is_configured": true, 00:20:33.759 "data_offset": 0, 00:20:33.759 "data_size": 65536 00:20:33.759 
}, 00:20:33.759 { 00:20:33.759 "name": "BaseBdev4", 00:20:33.759 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:33.759 "is_configured": true, 00:20:33.759 "data_offset": 0, 00:20:33.759 "data_size": 65536 00:20:33.759 } 00:20:33.759 ] 00:20:33.759 }' 00:20:33.759 19:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.759 19:56:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.327 19:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.327 19:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:34.586 19:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:34.586 19:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.586 19:56:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:34.586 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5e555668-3a7f-4752-84dd-7b3caf0a34a3 00:20:34.845 [2024-07-24 19:56:26.306457] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:34.845 [2024-07-24 19:56:26.306493] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19d54f0 00:20:34.845 [2024-07-24 19:56:26.306501] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:34.845 [2024-07-24 19:56:26.306693] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19d6250 00:20:34.845 
[2024-07-24 19:56:26.306819] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19d54f0 00:20:34.845 [2024-07-24 19:56:26.306829] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19d54f0 00:20:34.845 [2024-07-24 19:56:26.306983] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:34.845 NewBaseBdev 00:20:34.845 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:34.845 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:20:34.845 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:34.845 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:34.845 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:34.845 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:34.845 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:35.104 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:35.364 [ 00:20:35.364 { 00:20:35.364 "name": "NewBaseBdev", 00:20:35.364 "aliases": [ 00:20:35.364 "5e555668-3a7f-4752-84dd-7b3caf0a34a3" 00:20:35.364 ], 00:20:35.364 "product_name": "Malloc disk", 00:20:35.364 "block_size": 512, 00:20:35.364 "num_blocks": 65536, 00:20:35.364 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:35.364 "assigned_rate_limits": { 00:20:35.364 "rw_ios_per_sec": 0, 00:20:35.364 "rw_mbytes_per_sec": 0, 00:20:35.364 "r_mbytes_per_sec": 0, 00:20:35.364 
"w_mbytes_per_sec": 0 00:20:35.364 }, 00:20:35.364 "claimed": true, 00:20:35.364 "claim_type": "exclusive_write", 00:20:35.364 "zoned": false, 00:20:35.364 "supported_io_types": { 00:20:35.364 "read": true, 00:20:35.364 "write": true, 00:20:35.364 "unmap": true, 00:20:35.364 "flush": true, 00:20:35.364 "reset": true, 00:20:35.364 "nvme_admin": false, 00:20:35.364 "nvme_io": false, 00:20:35.364 "nvme_io_md": false, 00:20:35.364 "write_zeroes": true, 00:20:35.364 "zcopy": true, 00:20:35.364 "get_zone_info": false, 00:20:35.364 "zone_management": false, 00:20:35.364 "zone_append": false, 00:20:35.364 "compare": false, 00:20:35.364 "compare_and_write": false, 00:20:35.364 "abort": true, 00:20:35.364 "seek_hole": false, 00:20:35.364 "seek_data": false, 00:20:35.364 "copy": true, 00:20:35.364 "nvme_iov_md": false 00:20:35.364 }, 00:20:35.364 "memory_domains": [ 00:20:35.364 { 00:20:35.364 "dma_device_id": "system", 00:20:35.364 "dma_device_type": 1 00:20:35.364 }, 00:20:35.364 { 00:20:35.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.364 "dma_device_type": 2 00:20:35.364 } 00:20:35.364 ], 00:20:35.364 "driver_specific": {} 00:20:35.364 } 00:20:35.364 ] 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.364 19:56:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:35.622 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.622 "name": "Existed_Raid", 00:20:35.622 "uuid": "3329fcb3-cb7c-4694-88da-8fdd3e5b69d1", 00:20:35.622 "strip_size_kb": 64, 00:20:35.622 "state": "online", 00:20:35.622 "raid_level": "concat", 00:20:35.622 "superblock": false, 00:20:35.622 "num_base_bdevs": 4, 00:20:35.622 "num_base_bdevs_discovered": 4, 00:20:35.622 "num_base_bdevs_operational": 4, 00:20:35.622 "base_bdevs_list": [ 00:20:35.622 { 00:20:35.622 "name": "NewBaseBdev", 00:20:35.622 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:35.622 "is_configured": true, 00:20:35.622 "data_offset": 0, 00:20:35.622 "data_size": 65536 00:20:35.622 }, 00:20:35.622 { 00:20:35.622 "name": "BaseBdev2", 00:20:35.622 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:35.622 "is_configured": true, 00:20:35.622 "data_offset": 0, 00:20:35.622 "data_size": 65536 00:20:35.622 }, 00:20:35.622 { 00:20:35.622 "name": "BaseBdev3", 00:20:35.622 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:35.622 "is_configured": true, 00:20:35.622 "data_offset": 0, 00:20:35.622 "data_size": 65536 00:20:35.622 }, 00:20:35.622 { 00:20:35.622 "name": "BaseBdev4", 
00:20:35.622 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:35.622 "is_configured": true, 00:20:35.622 "data_offset": 0, 00:20:35.622 "data_size": 65536 00:20:35.622 } 00:20:35.622 ] 00:20:35.622 }' 00:20:35.622 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:35.622 19:56:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:36.189 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:36.448 [2024-07-24 19:56:27.903014] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:36.448 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:36.448 "name": "Existed_Raid", 00:20:36.448 "aliases": [ 00:20:36.448 "3329fcb3-cb7c-4694-88da-8fdd3e5b69d1" 00:20:36.448 ], 00:20:36.448 "product_name": "Raid Volume", 00:20:36.448 "block_size": 512, 00:20:36.448 "num_blocks": 262144, 00:20:36.448 "uuid": "3329fcb3-cb7c-4694-88da-8fdd3e5b69d1", 00:20:36.448 "assigned_rate_limits": { 00:20:36.448 "rw_ios_per_sec": 0, 00:20:36.448 
"rw_mbytes_per_sec": 0, 00:20:36.448 "r_mbytes_per_sec": 0, 00:20:36.448 "w_mbytes_per_sec": 0 00:20:36.448 }, 00:20:36.448 "claimed": false, 00:20:36.448 "zoned": false, 00:20:36.448 "supported_io_types": { 00:20:36.448 "read": true, 00:20:36.448 "write": true, 00:20:36.448 "unmap": true, 00:20:36.448 "flush": true, 00:20:36.448 "reset": true, 00:20:36.448 "nvme_admin": false, 00:20:36.448 "nvme_io": false, 00:20:36.448 "nvme_io_md": false, 00:20:36.448 "write_zeroes": true, 00:20:36.448 "zcopy": false, 00:20:36.448 "get_zone_info": false, 00:20:36.448 "zone_management": false, 00:20:36.448 "zone_append": false, 00:20:36.448 "compare": false, 00:20:36.448 "compare_and_write": false, 00:20:36.448 "abort": false, 00:20:36.448 "seek_hole": false, 00:20:36.448 "seek_data": false, 00:20:36.448 "copy": false, 00:20:36.448 "nvme_iov_md": false 00:20:36.448 }, 00:20:36.448 "memory_domains": [ 00:20:36.448 { 00:20:36.448 "dma_device_id": "system", 00:20:36.448 "dma_device_type": 1 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.448 "dma_device_type": 2 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "dma_device_id": "system", 00:20:36.448 "dma_device_type": 1 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.448 "dma_device_type": 2 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "dma_device_id": "system", 00:20:36.448 "dma_device_type": 1 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.448 "dma_device_type": 2 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "dma_device_id": "system", 00:20:36.448 "dma_device_type": 1 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.448 "dma_device_type": 2 00:20:36.448 } 00:20:36.448 ], 00:20:36.448 "driver_specific": { 00:20:36.448 "raid": { 00:20:36.448 "uuid": "3329fcb3-cb7c-4694-88da-8fdd3e5b69d1", 00:20:36.448 "strip_size_kb": 64, 00:20:36.448 "state": "online", 
00:20:36.448 "raid_level": "concat", 00:20:36.448 "superblock": false, 00:20:36.448 "num_base_bdevs": 4, 00:20:36.448 "num_base_bdevs_discovered": 4, 00:20:36.448 "num_base_bdevs_operational": 4, 00:20:36.448 "base_bdevs_list": [ 00:20:36.448 { 00:20:36.448 "name": "NewBaseBdev", 00:20:36.448 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:36.448 "is_configured": true, 00:20:36.448 "data_offset": 0, 00:20:36.448 "data_size": 65536 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "name": "BaseBdev2", 00:20:36.448 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:36.448 "is_configured": true, 00:20:36.448 "data_offset": 0, 00:20:36.448 "data_size": 65536 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "name": "BaseBdev3", 00:20:36.448 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:36.448 "is_configured": true, 00:20:36.448 "data_offset": 0, 00:20:36.448 "data_size": 65536 00:20:36.448 }, 00:20:36.448 { 00:20:36.448 "name": "BaseBdev4", 00:20:36.448 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:36.448 "is_configured": true, 00:20:36.448 "data_offset": 0, 00:20:36.448 "data_size": 65536 00:20:36.448 } 00:20:36.448 ] 00:20:36.448 } 00:20:36.448 } 00:20:36.448 }' 00:20:36.448 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:36.448 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:36.448 BaseBdev2 00:20:36.448 BaseBdev3 00:20:36.448 BaseBdev4' 00:20:36.448 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:36.448 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:36.448 19:56:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:36.708 19:56:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:36.708 "name": "NewBaseBdev", 00:20:36.708 "aliases": [ 00:20:36.708 "5e555668-3a7f-4752-84dd-7b3caf0a34a3" 00:20:36.708 ], 00:20:36.708 "product_name": "Malloc disk", 00:20:36.708 "block_size": 512, 00:20:36.708 "num_blocks": 65536, 00:20:36.708 "uuid": "5e555668-3a7f-4752-84dd-7b3caf0a34a3", 00:20:36.708 "assigned_rate_limits": { 00:20:36.708 "rw_ios_per_sec": 0, 00:20:36.708 "rw_mbytes_per_sec": 0, 00:20:36.708 "r_mbytes_per_sec": 0, 00:20:36.708 "w_mbytes_per_sec": 0 00:20:36.708 }, 00:20:36.708 "claimed": true, 00:20:36.708 "claim_type": "exclusive_write", 00:20:36.708 "zoned": false, 00:20:36.708 "supported_io_types": { 00:20:36.708 "read": true, 00:20:36.708 "write": true, 00:20:36.708 "unmap": true, 00:20:36.708 "flush": true, 00:20:36.708 "reset": true, 00:20:36.708 "nvme_admin": false, 00:20:36.708 "nvme_io": false, 00:20:36.708 "nvme_io_md": false, 00:20:36.708 "write_zeroes": true, 00:20:36.708 "zcopy": true, 00:20:36.708 "get_zone_info": false, 00:20:36.708 "zone_management": false, 00:20:36.708 "zone_append": false, 00:20:36.708 "compare": false, 00:20:36.708 "compare_and_write": false, 00:20:36.708 "abort": true, 00:20:36.708 "seek_hole": false, 00:20:36.708 "seek_data": false, 00:20:36.708 "copy": true, 00:20:36.708 "nvme_iov_md": false 00:20:36.708 }, 00:20:36.708 "memory_domains": [ 00:20:36.708 { 00:20:36.708 "dma_device_id": "system", 00:20:36.708 "dma_device_type": 1 00:20:36.708 }, 00:20:36.708 { 00:20:36.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.708 "dma_device_type": 2 00:20:36.708 } 00:20:36.708 ], 00:20:36.708 "driver_specific": {} 00:20:36.708 }' 00:20:36.708 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.708 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:36.966 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.226 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.226 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:37.226 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:37.226 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:37.226 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:37.485 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:37.485 "name": "BaseBdev2", 00:20:37.485 "aliases": [ 00:20:37.485 "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf" 00:20:37.485 ], 00:20:37.485 "product_name": "Malloc disk", 00:20:37.485 "block_size": 512, 00:20:37.485 "num_blocks": 65536, 00:20:37.485 "uuid": "6ddad680-8500-4a1b-8d9b-a8bd04e8d6bf", 00:20:37.485 "assigned_rate_limits": { 00:20:37.485 "rw_ios_per_sec": 0, 00:20:37.485 "rw_mbytes_per_sec": 0, 00:20:37.485 "r_mbytes_per_sec": 0, 00:20:37.485 "w_mbytes_per_sec": 0 00:20:37.485 }, 00:20:37.485 "claimed": true, 00:20:37.485 
"claim_type": "exclusive_write", 00:20:37.485 "zoned": false, 00:20:37.485 "supported_io_types": { 00:20:37.485 "read": true, 00:20:37.485 "write": true, 00:20:37.485 "unmap": true, 00:20:37.485 "flush": true, 00:20:37.485 "reset": true, 00:20:37.485 "nvme_admin": false, 00:20:37.485 "nvme_io": false, 00:20:37.485 "nvme_io_md": false, 00:20:37.485 "write_zeroes": true, 00:20:37.485 "zcopy": true, 00:20:37.485 "get_zone_info": false, 00:20:37.485 "zone_management": false, 00:20:37.485 "zone_append": false, 00:20:37.485 "compare": false, 00:20:37.486 "compare_and_write": false, 00:20:37.486 "abort": true, 00:20:37.486 "seek_hole": false, 00:20:37.486 "seek_data": false, 00:20:37.486 "copy": true, 00:20:37.486 "nvme_iov_md": false 00:20:37.486 }, 00:20:37.486 "memory_domains": [ 00:20:37.486 { 00:20:37.486 "dma_device_id": "system", 00:20:37.486 "dma_device_type": 1 00:20:37.486 }, 00:20:37.486 { 00:20:37.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:37.486 "dma_device_type": 2 00:20:37.486 } 00:20:37.486 ], 00:20:37.486 "driver_specific": {} 00:20:37.486 }' 00:20:37.486 19:56:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.486 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.486 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:37.486 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:37.744 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:37.745 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:38.004 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:38.004 "name": "BaseBdev3", 00:20:38.004 "aliases": [ 00:20:38.004 "0255f2ef-51aa-47dc-9c77-0315e1e53f56" 00:20:38.004 ], 00:20:38.004 "product_name": "Malloc disk", 00:20:38.004 "block_size": 512, 00:20:38.004 "num_blocks": 65536, 00:20:38.004 "uuid": "0255f2ef-51aa-47dc-9c77-0315e1e53f56", 00:20:38.004 "assigned_rate_limits": { 00:20:38.004 "rw_ios_per_sec": 0, 00:20:38.004 "rw_mbytes_per_sec": 0, 00:20:38.004 "r_mbytes_per_sec": 0, 00:20:38.004 "w_mbytes_per_sec": 0 00:20:38.004 }, 00:20:38.004 "claimed": true, 00:20:38.004 "claim_type": "exclusive_write", 00:20:38.004 "zoned": false, 00:20:38.004 "supported_io_types": { 00:20:38.004 "read": true, 00:20:38.004 "write": true, 00:20:38.004 "unmap": true, 00:20:38.004 "flush": true, 00:20:38.004 "reset": true, 00:20:38.004 "nvme_admin": false, 00:20:38.004 "nvme_io": false, 00:20:38.004 "nvme_io_md": false, 00:20:38.004 "write_zeroes": true, 00:20:38.004 "zcopy": true, 00:20:38.004 "get_zone_info": false, 00:20:38.004 "zone_management": false, 00:20:38.004 "zone_append": false, 00:20:38.004 "compare": false, 00:20:38.004 "compare_and_write": false, 00:20:38.004 "abort": true, 00:20:38.004 
"seek_hole": false, 00:20:38.004 "seek_data": false, 00:20:38.004 "copy": true, 00:20:38.004 "nvme_iov_md": false 00:20:38.004 }, 00:20:38.004 "memory_domains": [ 00:20:38.004 { 00:20:38.004 "dma_device_id": "system", 00:20:38.004 "dma_device_type": 1 00:20:38.004 }, 00:20:38.004 { 00:20:38.004 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.004 "dma_device_type": 2 00:20:38.004 } 00:20:38.004 ], 00:20:38.004 "driver_specific": {} 00:20:38.004 }' 00:20:38.004 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:38.263 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:38.522 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:38.522 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:38.522 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:38.522 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:38.522 19:56:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:38.781 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:38.781 "name": "BaseBdev4", 00:20:38.781 "aliases": [ 00:20:38.781 "9d13fba3-838e-4625-b4d5-d0b86de5aa2e" 00:20:38.781 ], 00:20:38.781 "product_name": "Malloc disk", 00:20:38.781 "block_size": 512, 00:20:38.781 "num_blocks": 65536, 00:20:38.781 "uuid": "9d13fba3-838e-4625-b4d5-d0b86de5aa2e", 00:20:38.781 "assigned_rate_limits": { 00:20:38.781 "rw_ios_per_sec": 0, 00:20:38.781 "rw_mbytes_per_sec": 0, 00:20:38.781 "r_mbytes_per_sec": 0, 00:20:38.781 "w_mbytes_per_sec": 0 00:20:38.781 }, 00:20:38.781 "claimed": true, 00:20:38.781 "claim_type": "exclusive_write", 00:20:38.781 "zoned": false, 00:20:38.781 "supported_io_types": { 00:20:38.781 "read": true, 00:20:38.781 "write": true, 00:20:38.781 "unmap": true, 00:20:38.781 "flush": true, 00:20:38.781 "reset": true, 00:20:38.781 "nvme_admin": false, 00:20:38.781 "nvme_io": false, 00:20:38.781 "nvme_io_md": false, 00:20:38.781 "write_zeroes": true, 00:20:38.781 "zcopy": true, 00:20:38.781 "get_zone_info": false, 00:20:38.781 "zone_management": false, 00:20:38.781 "zone_append": false, 00:20:38.781 "compare": false, 00:20:38.781 "compare_and_write": false, 00:20:38.781 "abort": true, 00:20:38.781 "seek_hole": false, 00:20:38.781 "seek_data": false, 00:20:38.781 "copy": true, 00:20:38.781 "nvme_iov_md": false 00:20:38.781 }, 00:20:38.781 "memory_domains": [ 00:20:38.781 { 00:20:38.781 "dma_device_id": "system", 00:20:38.781 "dma_device_type": 1 00:20:38.781 }, 00:20:38.781 { 00:20:38.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.781 "dma_device_type": 2 00:20:38.781 } 00:20:38.781 ], 00:20:38.781 "driver_specific": {} 00:20:38.781 }' 00:20:38.781 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.781 19:56:30 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.781 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:38.781 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.781 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.781 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:39.040 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:39.040 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:39.040 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:39.040 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:39.040 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:39.040 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:39.040 19:56:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:39.299 [2024-07-24 19:56:30.762304] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:39.299 [2024-07-24 19:56:30.762334] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:39.300 [2024-07-24 19:56:30.762381] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:39.300 [2024-07-24 19:56:30.762448] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:39.300 [2024-07-24 19:56:30.762461] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19d54f0 name Existed_Raid, state offline 00:20:39.300 19:56:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1451506 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1451506 ']' 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1451506 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1451506 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1451506' 00:20:39.300 killing process with pid 1451506 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1451506 00:20:39.300 [2024-07-24 19:56:30.835119] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:39.300 19:56:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1451506 00:20:39.300 [2024-07-24 19:56:30.873348] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:39.558 19:56:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:39.558 00:20:39.558 real 0m34.228s 00:20:39.558 user 1m2.816s 00:20:39.558 sys 0m6.094s 00:20:39.558 19:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:39.558 19:56:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:39.558 ************************************ 00:20:39.558 END TEST raid_state_function_test 
00:20:39.558 ************************************ 00:20:39.558 19:56:31 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:20:39.558 19:56:31 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:39.558 19:56:31 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:39.559 19:56:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:39.818 ************************************ 00:20:39.818 START TEST raid_state_function_test_sb 00:20:39.818 ************************************ 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 true 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:39.818 
19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1456559 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1456559' 00:20:39.818 Process raid pid: 1456559 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1456559 /var/tmp/spdk-raid.sock 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1456559 ']' 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:39.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:39.818 19:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:39.818 [2024-07-24 19:56:31.258011] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:20:39.818 [2024-07-24 19:56:31.258094] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:39.818 [2024-07-24 19:56:31.392657] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:40.077 [2024-07-24 19:56:31.495349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:40.077 [2024-07-24 19:56:31.566867] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:40.077 [2024-07-24 19:56:31.566901] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:40.693 19:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:40.693 19:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:20:40.693 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:41.018 [2024-07-24 19:56:32.420410] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:41.018 [2024-07-24 19:56:32.420450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:41.018 [2024-07-24 19:56:32.420461] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:41.019 [2024-07-24 19:56:32.420474] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:41.019 [2024-07-24 19:56:32.420482] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:41.019 [2024-07-24 19:56:32.420493] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:20:41.019 [2024-07-24 19:56:32.420502] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:41.019 [2024-07-24 19:56:32.420513] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.019 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:41.278 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.278 "name": "Existed_Raid", 00:20:41.278 "uuid": 
"e104bea5-b912-4963-8013-f72add57dddd", 00:20:41.278 "strip_size_kb": 64, 00:20:41.278 "state": "configuring", 00:20:41.278 "raid_level": "concat", 00:20:41.278 "superblock": true, 00:20:41.278 "num_base_bdevs": 4, 00:20:41.278 "num_base_bdevs_discovered": 0, 00:20:41.278 "num_base_bdevs_operational": 4, 00:20:41.278 "base_bdevs_list": [ 00:20:41.278 { 00:20:41.278 "name": "BaseBdev1", 00:20:41.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.278 "is_configured": false, 00:20:41.278 "data_offset": 0, 00:20:41.278 "data_size": 0 00:20:41.278 }, 00:20:41.278 { 00:20:41.278 "name": "BaseBdev2", 00:20:41.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.278 "is_configured": false, 00:20:41.278 "data_offset": 0, 00:20:41.278 "data_size": 0 00:20:41.278 }, 00:20:41.278 { 00:20:41.278 "name": "BaseBdev3", 00:20:41.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.278 "is_configured": false, 00:20:41.278 "data_offset": 0, 00:20:41.278 "data_size": 0 00:20:41.278 }, 00:20:41.278 { 00:20:41.278 "name": "BaseBdev4", 00:20:41.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.278 "is_configured": false, 00:20:41.278 "data_offset": 0, 00:20:41.278 "data_size": 0 00:20:41.278 } 00:20:41.278 ] 00:20:41.278 }' 00:20:41.278 19:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.278 19:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.846 19:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:42.104 [2024-07-24 19:56:33.563278] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:42.104 [2024-07-24 19:56:33.563316] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2526a30 name Existed_Raid, state configuring 00:20:42.104 19:56:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:42.363 [2024-07-24 19:56:33.811964] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:42.363 [2024-07-24 19:56:33.811998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:42.363 [2024-07-24 19:56:33.812009] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:42.363 [2024-07-24 19:56:33.812021] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:42.363 [2024-07-24 19:56:33.812030] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:42.363 [2024-07-24 19:56:33.812041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:42.363 [2024-07-24 19:56:33.812049] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:42.363 [2024-07-24 19:56:33.812060] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:42.363 19:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:42.622 [2024-07-24 19:56:34.070501] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:42.622 BaseBdev1 00:20:42.622 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:42.622 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:42.622 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 
-- # local bdev_timeout= 00:20:42.622 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:42.622 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:42.622 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:42.622 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:42.881 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:43.141 [ 00:20:43.141 { 00:20:43.141 "name": "BaseBdev1", 00:20:43.141 "aliases": [ 00:20:43.141 "fd333952-4a0e-427c-9b10-b72dfbeda3cf" 00:20:43.141 ], 00:20:43.141 "product_name": "Malloc disk", 00:20:43.141 "block_size": 512, 00:20:43.141 "num_blocks": 65536, 00:20:43.141 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:43.141 "assigned_rate_limits": { 00:20:43.141 "rw_ios_per_sec": 0, 00:20:43.141 "rw_mbytes_per_sec": 0, 00:20:43.141 "r_mbytes_per_sec": 0, 00:20:43.141 "w_mbytes_per_sec": 0 00:20:43.141 }, 00:20:43.141 "claimed": true, 00:20:43.141 "claim_type": "exclusive_write", 00:20:43.141 "zoned": false, 00:20:43.141 "supported_io_types": { 00:20:43.141 "read": true, 00:20:43.141 "write": true, 00:20:43.141 "unmap": true, 00:20:43.141 "flush": true, 00:20:43.141 "reset": true, 00:20:43.141 "nvme_admin": false, 00:20:43.141 "nvme_io": false, 00:20:43.141 "nvme_io_md": false, 00:20:43.141 "write_zeroes": true, 00:20:43.141 "zcopy": true, 00:20:43.141 "get_zone_info": false, 00:20:43.141 "zone_management": false, 00:20:43.141 "zone_append": false, 00:20:43.141 "compare": false, 00:20:43.141 "compare_and_write": false, 00:20:43.141 "abort": true, 00:20:43.141 "seek_hole": 
false, 00:20:43.141 "seek_data": false, 00:20:43.141 "copy": true, 00:20:43.141 "nvme_iov_md": false 00:20:43.141 }, 00:20:43.141 "memory_domains": [ 00:20:43.141 { 00:20:43.141 "dma_device_id": "system", 00:20:43.141 "dma_device_type": 1 00:20:43.141 }, 00:20:43.141 { 00:20:43.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.141 "dma_device_type": 2 00:20:43.141 } 00:20:43.141 ], 00:20:43.141 "driver_specific": {} 00:20:43.141 } 00:20:43.141 ] 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:43.141 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.401 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:43.401 "name": "Existed_Raid", 00:20:43.401 "uuid": "c3e8ef3e-6651-4e10-bc57-1be470f10f8b", 00:20:43.401 "strip_size_kb": 64, 00:20:43.401 "state": "configuring", 00:20:43.401 "raid_level": "concat", 00:20:43.401 "superblock": true, 00:20:43.401 "num_base_bdevs": 4, 00:20:43.401 "num_base_bdevs_discovered": 1, 00:20:43.401 "num_base_bdevs_operational": 4, 00:20:43.401 "base_bdevs_list": [ 00:20:43.401 { 00:20:43.401 "name": "BaseBdev1", 00:20:43.401 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:43.401 "is_configured": true, 00:20:43.401 "data_offset": 2048, 00:20:43.401 "data_size": 63488 00:20:43.401 }, 00:20:43.401 { 00:20:43.401 "name": "BaseBdev2", 00:20:43.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.401 "is_configured": false, 00:20:43.401 "data_offset": 0, 00:20:43.401 "data_size": 0 00:20:43.401 }, 00:20:43.401 { 00:20:43.401 "name": "BaseBdev3", 00:20:43.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.401 "is_configured": false, 00:20:43.401 "data_offset": 0, 00:20:43.401 "data_size": 0 00:20:43.401 }, 00:20:43.401 { 00:20:43.401 "name": "BaseBdev4", 00:20:43.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.401 "is_configured": false, 00:20:43.401 "data_offset": 0, 00:20:43.401 "data_size": 0 00:20:43.401 } 00:20:43.401 ] 00:20:43.401 }' 00:20:43.401 19:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:43.401 19:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.967 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:44.227 [2024-07-24 
19:56:35.702833] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:44.227 [2024-07-24 19:56:35.702874] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25262a0 name Existed_Raid, state configuring 00:20:44.227 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:44.486 [2024-07-24 19:56:35.951540] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:44.486 [2024-07-24 19:56:35.952951] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:44.486 [2024-07-24 19:56:35.952983] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:44.486 [2024-07-24 19:56:35.952994] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:44.486 [2024-07-24 19:56:35.953006] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:44.486 [2024-07-24 19:56:35.953015] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:44.486 [2024-07-24 19:56:35.953026] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.486 19:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:44.745 19:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.745 "name": "Existed_Raid", 00:20:44.745 "uuid": "e086bfdc-1395-4345-82fe-39094943388e", 00:20:44.745 "strip_size_kb": 64, 00:20:44.745 "state": "configuring", 00:20:44.745 "raid_level": "concat", 00:20:44.745 "superblock": true, 00:20:44.745 "num_base_bdevs": 4, 00:20:44.745 "num_base_bdevs_discovered": 1, 00:20:44.745 "num_base_bdevs_operational": 4, 00:20:44.745 "base_bdevs_list": [ 00:20:44.745 { 00:20:44.745 "name": "BaseBdev1", 00:20:44.745 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:44.745 "is_configured": true, 00:20:44.745 "data_offset": 2048, 00:20:44.745 "data_size": 63488 00:20:44.745 }, 00:20:44.745 { 00:20:44.745 "name": "BaseBdev2", 00:20:44.745 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:44.745 "is_configured": false, 00:20:44.745 "data_offset": 0, 00:20:44.745 "data_size": 0 00:20:44.745 }, 00:20:44.745 { 00:20:44.745 "name": "BaseBdev3", 00:20:44.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.745 "is_configured": false, 00:20:44.745 "data_offset": 0, 00:20:44.745 "data_size": 0 00:20:44.745 }, 00:20:44.745 { 00:20:44.745 "name": "BaseBdev4", 00:20:44.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.745 "is_configured": false, 00:20:44.745 "data_offset": 0, 00:20:44.745 "data_size": 0 00:20:44.745 } 00:20:44.745 ] 00:20:44.745 }' 00:20:44.745 19:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.745 19:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.683 19:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:45.683 [2024-07-24 19:56:37.243003] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:45.683 BaseBdev2 00:20:45.683 19:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:45.683 19:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:45.683 19:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:45.683 19:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:45.683 19:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:45.683 19:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:45.683 19:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:46.251 19:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:46.510 [ 00:20:46.510 { 00:20:46.510 "name": "BaseBdev2", 00:20:46.510 "aliases": [ 00:20:46.510 "da2aef6d-0fbc-4861-b217-6bd185358bd1" 00:20:46.510 ], 00:20:46.510 "product_name": "Malloc disk", 00:20:46.510 "block_size": 512, 00:20:46.510 "num_blocks": 65536, 00:20:46.510 "uuid": "da2aef6d-0fbc-4861-b217-6bd185358bd1", 00:20:46.510 "assigned_rate_limits": { 00:20:46.510 "rw_ios_per_sec": 0, 00:20:46.510 "rw_mbytes_per_sec": 0, 00:20:46.510 "r_mbytes_per_sec": 0, 00:20:46.510 "w_mbytes_per_sec": 0 00:20:46.510 }, 00:20:46.510 "claimed": true, 00:20:46.510 "claim_type": "exclusive_write", 00:20:46.510 "zoned": false, 00:20:46.510 "supported_io_types": { 00:20:46.510 "read": true, 00:20:46.510 "write": true, 00:20:46.510 "unmap": true, 00:20:46.510 "flush": true, 00:20:46.510 "reset": true, 00:20:46.510 "nvme_admin": false, 00:20:46.510 "nvme_io": false, 00:20:46.510 "nvme_io_md": false, 00:20:46.510 "write_zeroes": true, 00:20:46.510 "zcopy": true, 00:20:46.510 "get_zone_info": false, 00:20:46.510 "zone_management": false, 00:20:46.510 "zone_append": false, 00:20:46.510 "compare": false, 00:20:46.510 "compare_and_write": false, 00:20:46.510 "abort": true, 00:20:46.510 "seek_hole": false, 00:20:46.510 "seek_data": false, 00:20:46.510 "copy": true, 00:20:46.510 "nvme_iov_md": false 00:20:46.510 }, 00:20:46.510 "memory_domains": [ 00:20:46.510 { 00:20:46.510 "dma_device_id": "system", 00:20:46.510 "dma_device_type": 1 00:20:46.510 }, 00:20:46.510 { 00:20:46.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.510 "dma_device_type": 2 00:20:46.510 } 00:20:46.510 ], 00:20:46.510 "driver_specific": {} 00:20:46.510 } 00:20:46.510 ] 
00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.510 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.078 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.078 "name": "Existed_Raid", 
00:20:47.078 "uuid": "e086bfdc-1395-4345-82fe-39094943388e", 00:20:47.078 "strip_size_kb": 64, 00:20:47.078 "state": "configuring", 00:20:47.078 "raid_level": "concat", 00:20:47.078 "superblock": true, 00:20:47.078 "num_base_bdevs": 4, 00:20:47.078 "num_base_bdevs_discovered": 2, 00:20:47.078 "num_base_bdevs_operational": 4, 00:20:47.078 "base_bdevs_list": [ 00:20:47.078 { 00:20:47.078 "name": "BaseBdev1", 00:20:47.078 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:47.078 "is_configured": true, 00:20:47.078 "data_offset": 2048, 00:20:47.078 "data_size": 63488 00:20:47.078 }, 00:20:47.078 { 00:20:47.078 "name": "BaseBdev2", 00:20:47.078 "uuid": "da2aef6d-0fbc-4861-b217-6bd185358bd1", 00:20:47.078 "is_configured": true, 00:20:47.078 "data_offset": 2048, 00:20:47.078 "data_size": 63488 00:20:47.078 }, 00:20:47.078 { 00:20:47.078 "name": "BaseBdev3", 00:20:47.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.078 "is_configured": false, 00:20:47.078 "data_offset": 0, 00:20:47.078 "data_size": 0 00:20:47.078 }, 00:20:47.078 { 00:20:47.078 "name": "BaseBdev4", 00:20:47.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.078 "is_configured": false, 00:20:47.078 "data_offset": 0, 00:20:47.078 "data_size": 0 00:20:47.078 } 00:20:47.078 ] 00:20:47.078 }' 00:20:47.078 19:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.078 19:56:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:47.646 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:47.904 [2024-07-24 19:56:39.456238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:47.904 BaseBdev3 00:20:47.904 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:47.904 
19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:47.904 19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:47.904 19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:47.904 19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:47.904 19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:47.904 19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:48.164 19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:48.423 [ 00:20:48.423 { 00:20:48.423 "name": "BaseBdev3", 00:20:48.423 "aliases": [ 00:20:48.423 "903370ea-9e32-4f81-86ba-f647d5d38b9c" 00:20:48.423 ], 00:20:48.423 "product_name": "Malloc disk", 00:20:48.423 "block_size": 512, 00:20:48.423 "num_blocks": 65536, 00:20:48.423 "uuid": "903370ea-9e32-4f81-86ba-f647d5d38b9c", 00:20:48.423 "assigned_rate_limits": { 00:20:48.423 "rw_ios_per_sec": 0, 00:20:48.423 "rw_mbytes_per_sec": 0, 00:20:48.423 "r_mbytes_per_sec": 0, 00:20:48.423 "w_mbytes_per_sec": 0 00:20:48.423 }, 00:20:48.423 "claimed": true, 00:20:48.423 "claim_type": "exclusive_write", 00:20:48.423 "zoned": false, 00:20:48.423 "supported_io_types": { 00:20:48.423 "read": true, 00:20:48.423 "write": true, 00:20:48.423 "unmap": true, 00:20:48.423 "flush": true, 00:20:48.423 "reset": true, 00:20:48.423 "nvme_admin": false, 00:20:48.423 "nvme_io": false, 00:20:48.423 "nvme_io_md": false, 00:20:48.423 "write_zeroes": true, 00:20:48.423 "zcopy": true, 00:20:48.423 "get_zone_info": 
false, 00:20:48.423 "zone_management": false, 00:20:48.423 "zone_append": false, 00:20:48.423 "compare": false, 00:20:48.423 "compare_and_write": false, 00:20:48.423 "abort": true, 00:20:48.423 "seek_hole": false, 00:20:48.423 "seek_data": false, 00:20:48.423 "copy": true, 00:20:48.423 "nvme_iov_md": false 00:20:48.423 }, 00:20:48.423 "memory_domains": [ 00:20:48.423 { 00:20:48.423 "dma_device_id": "system", 00:20:48.423 "dma_device_type": 1 00:20:48.423 }, 00:20:48.423 { 00:20:48.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.423 "dma_device_type": 2 00:20:48.423 } 00:20:48.423 ], 00:20:48.423 "driver_specific": {} 00:20:48.423 } 00:20:48.423 ] 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.423 19:56:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.423 19:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:48.682 19:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.682 "name": "Existed_Raid", 00:20:48.682 "uuid": "e086bfdc-1395-4345-82fe-39094943388e", 00:20:48.682 "strip_size_kb": 64, 00:20:48.682 "state": "configuring", 00:20:48.682 "raid_level": "concat", 00:20:48.682 "superblock": true, 00:20:48.682 "num_base_bdevs": 4, 00:20:48.682 "num_base_bdevs_discovered": 3, 00:20:48.682 "num_base_bdevs_operational": 4, 00:20:48.682 "base_bdevs_list": [ 00:20:48.682 { 00:20:48.682 "name": "BaseBdev1", 00:20:48.682 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:48.682 "is_configured": true, 00:20:48.682 "data_offset": 2048, 00:20:48.682 "data_size": 63488 00:20:48.682 }, 00:20:48.682 { 00:20:48.682 "name": "BaseBdev2", 00:20:48.682 "uuid": "da2aef6d-0fbc-4861-b217-6bd185358bd1", 00:20:48.682 "is_configured": true, 00:20:48.683 "data_offset": 2048, 00:20:48.683 "data_size": 63488 00:20:48.683 }, 00:20:48.683 { 00:20:48.683 "name": "BaseBdev3", 00:20:48.683 "uuid": "903370ea-9e32-4f81-86ba-f647d5d38b9c", 00:20:48.683 "is_configured": true, 00:20:48.683 "data_offset": 2048, 00:20:48.683 "data_size": 63488 00:20:48.683 }, 00:20:48.683 { 00:20:48.683 "name": "BaseBdev4", 00:20:48.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.683 "is_configured": false, 00:20:48.683 "data_offset": 0, 00:20:48.683 "data_size": 0 00:20:48.683 } 00:20:48.683 ] 00:20:48.683 }' 00:20:48.683 
19:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.683 19:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:49.250 19:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:49.509 [2024-07-24 19:56:41.035818] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:49.509 [2024-07-24 19:56:41.035990] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2527300 00:20:49.509 [2024-07-24 19:56:41.036004] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:49.509 [2024-07-24 19:56:41.036186] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2528280 00:20:49.509 [2024-07-24 19:56:41.036316] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2527300 00:20:49.509 [2024-07-24 19:56:41.036326] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2527300 00:20:49.509 [2024-07-24 19:56:41.036435] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:49.509 BaseBdev4 00:20:49.509 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:49.509 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:49.509 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:49.509 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:49.509 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:49.509 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:20:49.509 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:49.767 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:50.026 [ 00:20:50.026 { 00:20:50.026 "name": "BaseBdev4", 00:20:50.026 "aliases": [ 00:20:50.026 "7bd38552-e13b-4f19-8883-64af40d98bcd" 00:20:50.026 ], 00:20:50.026 "product_name": "Malloc disk", 00:20:50.026 "block_size": 512, 00:20:50.026 "num_blocks": 65536, 00:20:50.026 "uuid": "7bd38552-e13b-4f19-8883-64af40d98bcd", 00:20:50.026 "assigned_rate_limits": { 00:20:50.026 "rw_ios_per_sec": 0, 00:20:50.026 "rw_mbytes_per_sec": 0, 00:20:50.026 "r_mbytes_per_sec": 0, 00:20:50.026 "w_mbytes_per_sec": 0 00:20:50.026 }, 00:20:50.026 "claimed": true, 00:20:50.026 "claim_type": "exclusive_write", 00:20:50.026 "zoned": false, 00:20:50.026 "supported_io_types": { 00:20:50.026 "read": true, 00:20:50.026 "write": true, 00:20:50.026 "unmap": true, 00:20:50.026 "flush": true, 00:20:50.026 "reset": true, 00:20:50.026 "nvme_admin": false, 00:20:50.026 "nvme_io": false, 00:20:50.026 "nvme_io_md": false, 00:20:50.026 "write_zeroes": true, 00:20:50.026 "zcopy": true, 00:20:50.026 "get_zone_info": false, 00:20:50.026 "zone_management": false, 00:20:50.026 "zone_append": false, 00:20:50.026 "compare": false, 00:20:50.026 "compare_and_write": false, 00:20:50.026 "abort": true, 00:20:50.026 "seek_hole": false, 00:20:50.026 "seek_data": false, 00:20:50.026 "copy": true, 00:20:50.026 "nvme_iov_md": false 00:20:50.026 }, 00:20:50.026 "memory_domains": [ 00:20:50.026 { 00:20:50.026 "dma_device_id": "system", 00:20:50.026 "dma_device_type": 1 00:20:50.026 }, 00:20:50.026 { 00:20:50.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:50.026 
"dma_device_type": 2 00:20:50.026 } 00:20:50.026 ], 00:20:50.026 "driver_specific": {} 00:20:50.026 } 00:20:50.026 ] 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:50.026 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:50.027 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:50.027 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.027 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.027 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.027 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.027 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.027 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:50.285 19:56:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.285 "name": "Existed_Raid", 00:20:50.285 "uuid": "e086bfdc-1395-4345-82fe-39094943388e", 00:20:50.285 "strip_size_kb": 64, 00:20:50.285 "state": "online", 00:20:50.285 "raid_level": "concat", 00:20:50.285 "superblock": true, 00:20:50.285 "num_base_bdevs": 4, 00:20:50.285 "num_base_bdevs_discovered": 4, 00:20:50.285 "num_base_bdevs_operational": 4, 00:20:50.285 "base_bdevs_list": [ 00:20:50.285 { 00:20:50.285 "name": "BaseBdev1", 00:20:50.285 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:50.285 "is_configured": true, 00:20:50.285 "data_offset": 2048, 00:20:50.285 "data_size": 63488 00:20:50.285 }, 00:20:50.285 { 00:20:50.285 "name": "BaseBdev2", 00:20:50.285 "uuid": "da2aef6d-0fbc-4861-b217-6bd185358bd1", 00:20:50.285 "is_configured": true, 00:20:50.285 "data_offset": 2048, 00:20:50.285 "data_size": 63488 00:20:50.285 }, 00:20:50.285 { 00:20:50.285 "name": "BaseBdev3", 00:20:50.285 "uuid": "903370ea-9e32-4f81-86ba-f647d5d38b9c", 00:20:50.285 "is_configured": true, 00:20:50.285 "data_offset": 2048, 00:20:50.285 "data_size": 63488 00:20:50.285 }, 00:20:50.285 { 00:20:50.285 "name": "BaseBdev4", 00:20:50.285 "uuid": "7bd38552-e13b-4f19-8883-64af40d98bcd", 00:20:50.285 "is_configured": true, 00:20:50.285 "data_offset": 2048, 00:20:50.285 "data_size": 63488 00:20:50.285 } 00:20:50.285 ] 00:20:50.285 }' 00:20:50.285 19:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.285 19:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:50.852 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:50.852 [2024-07-24 19:56:42.432037] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:51.111 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:51.111 "name": "Existed_Raid", 00:20:51.111 "aliases": [ 00:20:51.111 "e086bfdc-1395-4345-82fe-39094943388e" 00:20:51.111 ], 00:20:51.111 "product_name": "Raid Volume", 00:20:51.111 "block_size": 512, 00:20:51.111 "num_blocks": 253952, 00:20:51.111 "uuid": "e086bfdc-1395-4345-82fe-39094943388e", 00:20:51.111 "assigned_rate_limits": { 00:20:51.111 "rw_ios_per_sec": 0, 00:20:51.111 "rw_mbytes_per_sec": 0, 00:20:51.111 "r_mbytes_per_sec": 0, 00:20:51.111 "w_mbytes_per_sec": 0 00:20:51.111 }, 00:20:51.111 "claimed": false, 00:20:51.111 "zoned": false, 00:20:51.111 "supported_io_types": { 00:20:51.111 "read": true, 00:20:51.111 "write": true, 00:20:51.111 "unmap": true, 00:20:51.111 "flush": true, 00:20:51.111 "reset": true, 00:20:51.111 "nvme_admin": false, 00:20:51.111 "nvme_io": false, 00:20:51.111 "nvme_io_md": false, 00:20:51.111 "write_zeroes": true, 00:20:51.111 "zcopy": false, 00:20:51.111 "get_zone_info": false, 00:20:51.111 "zone_management": false, 00:20:51.111 "zone_append": false, 00:20:51.111 "compare": false, 00:20:51.111 "compare_and_write": false, 00:20:51.111 "abort": false, 00:20:51.111 "seek_hole": 
false, 00:20:51.111 "seek_data": false, 00:20:51.111 "copy": false, 00:20:51.111 "nvme_iov_md": false 00:20:51.111 }, 00:20:51.111 "memory_domains": [ 00:20:51.111 { 00:20:51.111 "dma_device_id": "system", 00:20:51.111 "dma_device_type": 1 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.111 "dma_device_type": 2 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "dma_device_id": "system", 00:20:51.111 "dma_device_type": 1 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.111 "dma_device_type": 2 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "dma_device_id": "system", 00:20:51.111 "dma_device_type": 1 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.111 "dma_device_type": 2 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "dma_device_id": "system", 00:20:51.111 "dma_device_type": 1 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.111 "dma_device_type": 2 00:20:51.111 } 00:20:51.111 ], 00:20:51.111 "driver_specific": { 00:20:51.111 "raid": { 00:20:51.111 "uuid": "e086bfdc-1395-4345-82fe-39094943388e", 00:20:51.111 "strip_size_kb": 64, 00:20:51.111 "state": "online", 00:20:51.111 "raid_level": "concat", 00:20:51.111 "superblock": true, 00:20:51.111 "num_base_bdevs": 4, 00:20:51.111 "num_base_bdevs_discovered": 4, 00:20:51.111 "num_base_bdevs_operational": 4, 00:20:51.111 "base_bdevs_list": [ 00:20:51.111 { 00:20:51.111 "name": "BaseBdev1", 00:20:51.111 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:51.111 "is_configured": true, 00:20:51.111 "data_offset": 2048, 00:20:51.111 "data_size": 63488 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "name": "BaseBdev2", 00:20:51.111 "uuid": "da2aef6d-0fbc-4861-b217-6bd185358bd1", 00:20:51.111 "is_configured": true, 00:20:51.111 "data_offset": 2048, 00:20:51.111 "data_size": 63488 00:20:51.111 }, 00:20:51.111 { 00:20:51.111 "name": "BaseBdev3", 00:20:51.111 
"uuid": "903370ea-9e32-4f81-86ba-f647d5d38b9c", 00:20:51.111 "is_configured": true, 00:20:51.111 "data_offset": 2048, 00:20:51.111 "data_size": 63488 00:20:51.111 }, 00:20:51.111 { 00:20:51.112 "name": "BaseBdev4", 00:20:51.112 "uuid": "7bd38552-e13b-4f19-8883-64af40d98bcd", 00:20:51.112 "is_configured": true, 00:20:51.112 "data_offset": 2048, 00:20:51.112 "data_size": 63488 00:20:51.112 } 00:20:51.112 ] 00:20:51.112 } 00:20:51.112 } 00:20:51.112 }' 00:20:51.112 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:51.112 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:51.112 BaseBdev2 00:20:51.112 BaseBdev3 00:20:51.112 BaseBdev4' 00:20:51.112 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:51.112 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:51.112 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:51.370 "name": "BaseBdev1", 00:20:51.370 "aliases": [ 00:20:51.370 "fd333952-4a0e-427c-9b10-b72dfbeda3cf" 00:20:51.370 ], 00:20:51.370 "product_name": "Malloc disk", 00:20:51.370 "block_size": 512, 00:20:51.370 "num_blocks": 65536, 00:20:51.370 "uuid": "fd333952-4a0e-427c-9b10-b72dfbeda3cf", 00:20:51.370 "assigned_rate_limits": { 00:20:51.370 "rw_ios_per_sec": 0, 00:20:51.370 "rw_mbytes_per_sec": 0, 00:20:51.370 "r_mbytes_per_sec": 0, 00:20:51.370 "w_mbytes_per_sec": 0 00:20:51.370 }, 00:20:51.370 "claimed": true, 00:20:51.370 "claim_type": "exclusive_write", 00:20:51.370 "zoned": false, 00:20:51.370 "supported_io_types": { 
00:20:51.370 "read": true, 00:20:51.370 "write": true, 00:20:51.370 "unmap": true, 00:20:51.370 "flush": true, 00:20:51.370 "reset": true, 00:20:51.370 "nvme_admin": false, 00:20:51.370 "nvme_io": false, 00:20:51.370 "nvme_io_md": false, 00:20:51.370 "write_zeroes": true, 00:20:51.370 "zcopy": true, 00:20:51.370 "get_zone_info": false, 00:20:51.370 "zone_management": false, 00:20:51.370 "zone_append": false, 00:20:51.370 "compare": false, 00:20:51.370 "compare_and_write": false, 00:20:51.370 "abort": true, 00:20:51.370 "seek_hole": false, 00:20:51.370 "seek_data": false, 00:20:51.370 "copy": true, 00:20:51.370 "nvme_iov_md": false 00:20:51.370 }, 00:20:51.370 "memory_domains": [ 00:20:51.370 { 00:20:51.370 "dma_device_id": "system", 00:20:51.370 "dma_device_type": 1 00:20:51.370 }, 00:20:51.370 { 00:20:51.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.370 "dma_device_type": 2 00:20:51.370 } 00:20:51.370 ], 00:20:51.370 "driver_specific": {} 00:20:51.370 }' 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:51.370 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:51.629 19:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:51.629 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:51.629 19:56:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:51.629 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:51.629 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:51.629 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:51.629 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:51.629 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:51.886 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:51.886 "name": "BaseBdev2", 00:20:51.886 "aliases": [ 00:20:51.886 "da2aef6d-0fbc-4861-b217-6bd185358bd1" 00:20:51.886 ], 00:20:51.886 "product_name": "Malloc disk", 00:20:51.886 "block_size": 512, 00:20:51.886 "num_blocks": 65536, 00:20:51.886 "uuid": "da2aef6d-0fbc-4861-b217-6bd185358bd1", 00:20:51.886 "assigned_rate_limits": { 00:20:51.886 "rw_ios_per_sec": 0, 00:20:51.886 "rw_mbytes_per_sec": 0, 00:20:51.886 "r_mbytes_per_sec": 0, 00:20:51.886 "w_mbytes_per_sec": 0 00:20:51.886 }, 00:20:51.886 "claimed": true, 00:20:51.886 "claim_type": "exclusive_write", 00:20:51.886 "zoned": false, 00:20:51.886 "supported_io_types": { 00:20:51.886 "read": true, 00:20:51.886 "write": true, 00:20:51.886 "unmap": true, 00:20:51.886 "flush": true, 00:20:51.886 "reset": true, 00:20:51.886 "nvme_admin": false, 00:20:51.886 "nvme_io": false, 00:20:51.886 "nvme_io_md": false, 00:20:51.886 "write_zeroes": true, 00:20:51.886 "zcopy": true, 00:20:51.886 "get_zone_info": false, 00:20:51.886 "zone_management": false, 00:20:51.886 "zone_append": false, 00:20:51.886 "compare": false, 00:20:51.886 "compare_and_write": false, 00:20:51.886 "abort": true, 00:20:51.886 "seek_hole": false, 00:20:51.886 "seek_data": 
false, 00:20:51.886 "copy": true, 00:20:51.886 "nvme_iov_md": false 00:20:51.886 }, 00:20:51.886 "memory_domains": [ 00:20:51.886 { 00:20:51.886 "dma_device_id": "system", 00:20:51.886 "dma_device_type": 1 00:20:51.886 }, 00:20:51.886 { 00:20:51.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.886 "dma_device_type": 2 00:20:51.886 } 00:20:51.886 ], 00:20:51.886 "driver_specific": {} 00:20:51.886 }' 00:20:51.886 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:51.886 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:51.886 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:51.886 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:52.143 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:52.400 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:20:52.400 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:52.400 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:52.400 "name": "BaseBdev3", 00:20:52.400 "aliases": [ 00:20:52.400 "903370ea-9e32-4f81-86ba-f647d5d38b9c" 00:20:52.400 ], 00:20:52.400 "product_name": "Malloc disk", 00:20:52.400 "block_size": 512, 00:20:52.400 "num_blocks": 65536, 00:20:52.400 "uuid": "903370ea-9e32-4f81-86ba-f647d5d38b9c", 00:20:52.400 "assigned_rate_limits": { 00:20:52.400 "rw_ios_per_sec": 0, 00:20:52.400 "rw_mbytes_per_sec": 0, 00:20:52.400 "r_mbytes_per_sec": 0, 00:20:52.400 "w_mbytes_per_sec": 0 00:20:52.400 }, 00:20:52.400 "claimed": true, 00:20:52.400 "claim_type": "exclusive_write", 00:20:52.400 "zoned": false, 00:20:52.400 "supported_io_types": { 00:20:52.400 "read": true, 00:20:52.400 "write": true, 00:20:52.400 "unmap": true, 00:20:52.400 "flush": true, 00:20:52.400 "reset": true, 00:20:52.400 "nvme_admin": false, 00:20:52.400 "nvme_io": false, 00:20:52.400 "nvme_io_md": false, 00:20:52.400 "write_zeroes": true, 00:20:52.400 "zcopy": true, 00:20:52.400 "get_zone_info": false, 00:20:52.400 "zone_management": false, 00:20:52.400 "zone_append": false, 00:20:52.400 "compare": false, 00:20:52.400 "compare_and_write": false, 00:20:52.400 "abort": true, 00:20:52.400 "seek_hole": false, 00:20:52.400 "seek_data": false, 00:20:52.400 "copy": true, 00:20:52.400 "nvme_iov_md": false 00:20:52.400 }, 00:20:52.400 "memory_domains": [ 00:20:52.400 { 00:20:52.400 "dma_device_id": "system", 00:20:52.400 "dma_device_type": 1 00:20:52.400 }, 00:20:52.400 { 00:20:52.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.400 "dma_device_type": 2 00:20:52.400 } 00:20:52.400 ], 00:20:52.400 "driver_specific": {} 00:20:52.400 }' 00:20:52.400 19:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:52.658 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:52.916 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:52.916 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:52.916 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:52.916 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:52.916 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:53.174 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:53.174 "name": "BaseBdev4", 00:20:53.174 "aliases": [ 00:20:53.174 "7bd38552-e13b-4f19-8883-64af40d98bcd" 00:20:53.174 ], 00:20:53.174 "product_name": "Malloc disk", 00:20:53.174 "block_size": 512, 00:20:53.174 "num_blocks": 65536, 00:20:53.174 "uuid": "7bd38552-e13b-4f19-8883-64af40d98bcd", 00:20:53.174 "assigned_rate_limits": { 00:20:53.174 
"rw_ios_per_sec": 0, 00:20:53.174 "rw_mbytes_per_sec": 0, 00:20:53.174 "r_mbytes_per_sec": 0, 00:20:53.174 "w_mbytes_per_sec": 0 00:20:53.174 }, 00:20:53.174 "claimed": true, 00:20:53.174 "claim_type": "exclusive_write", 00:20:53.174 "zoned": false, 00:20:53.174 "supported_io_types": { 00:20:53.174 "read": true, 00:20:53.174 "write": true, 00:20:53.174 "unmap": true, 00:20:53.174 "flush": true, 00:20:53.174 "reset": true, 00:20:53.174 "nvme_admin": false, 00:20:53.174 "nvme_io": false, 00:20:53.174 "nvme_io_md": false, 00:20:53.174 "write_zeroes": true, 00:20:53.174 "zcopy": true, 00:20:53.174 "get_zone_info": false, 00:20:53.174 "zone_management": false, 00:20:53.174 "zone_append": false, 00:20:53.174 "compare": false, 00:20:53.174 "compare_and_write": false, 00:20:53.174 "abort": true, 00:20:53.174 "seek_hole": false, 00:20:53.174 "seek_data": false, 00:20:53.174 "copy": true, 00:20:53.174 "nvme_iov_md": false 00:20:53.174 }, 00:20:53.174 "memory_domains": [ 00:20:53.174 { 00:20:53.174 "dma_device_id": "system", 00:20:53.174 "dma_device_type": 1 00:20:53.174 }, 00:20:53.174 { 00:20:53.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.174 "dma_device_type": 2 00:20:53.174 } 00:20:53.174 ], 00:20:53.174 "driver_specific": {} 00:20:53.174 }' 00:20:53.174 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:53.174 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:53.175 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:53.175 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:53.175 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:53.175 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:53.175 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:20:53.433 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:53.433 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:53.433 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:53.433 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:53.433 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:53.433 19:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:53.691 [2024-07-24 19:56:45.150984] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:53.691 [2024-07-24 19:56:45.151013] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:53.691 [2024-07-24 19:56:45.151062] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.691 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.001 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.001 "name": "Existed_Raid", 00:20:54.001 "uuid": "e086bfdc-1395-4345-82fe-39094943388e", 00:20:54.001 "strip_size_kb": 64, 00:20:54.001 "state": "offline", 00:20:54.001 "raid_level": "concat", 00:20:54.001 "superblock": true, 00:20:54.001 "num_base_bdevs": 4, 00:20:54.001 "num_base_bdevs_discovered": 3, 00:20:54.001 "num_base_bdevs_operational": 3, 00:20:54.001 "base_bdevs_list": [ 00:20:54.001 { 00:20:54.001 "name": null, 00:20:54.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.001 "is_configured": false, 00:20:54.001 "data_offset": 2048, 00:20:54.001 "data_size": 63488 00:20:54.001 }, 00:20:54.001 { 00:20:54.001 "name": "BaseBdev2", 00:20:54.001 "uuid": 
"da2aef6d-0fbc-4861-b217-6bd185358bd1", 00:20:54.001 "is_configured": true, 00:20:54.001 "data_offset": 2048, 00:20:54.001 "data_size": 63488 00:20:54.001 }, 00:20:54.001 { 00:20:54.001 "name": "BaseBdev3", 00:20:54.001 "uuid": "903370ea-9e32-4f81-86ba-f647d5d38b9c", 00:20:54.001 "is_configured": true, 00:20:54.001 "data_offset": 2048, 00:20:54.001 "data_size": 63488 00:20:54.001 }, 00:20:54.001 { 00:20:54.001 "name": "BaseBdev4", 00:20:54.001 "uuid": "7bd38552-e13b-4f19-8883-64af40d98bcd", 00:20:54.001 "is_configured": true, 00:20:54.001 "data_offset": 2048, 00:20:54.001 "data_size": 63488 00:20:54.001 } 00:20:54.001 ] 00:20:54.001 }' 00:20:54.001 19:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.001 19:56:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:54.569 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:54.569 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:54.569 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.569 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:54.827 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:54.827 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:54.827 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:55.394 [2024-07-24 19:56:46.756248] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:55.394 19:56:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:55.394 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:55.395 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.395 19:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:55.653 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:55.653 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:55.653 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:55.911 [2024-07-24 19:56:47.322128] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:55.911 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:55.911 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:55.911 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.911 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:56.169 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:56.169 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:56.169 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:56.427 [2024-07-24 19:56:47.810159] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:56.427 [2024-07-24 19:56:47.810203] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2527300 name Existed_Raid, state offline 00:20:56.427 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:56.427 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:56.427 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.427 19:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:56.686 19:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:56.686 19:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:56.686 19:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:56.686 19:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:56.686 19:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:56.686 19:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:57.312 BaseBdev2 00:20:57.312 19:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:57.312 19:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:57.312 19:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:57.312 19:56:48 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:57.312 19:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:57.312 19:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:57.312 19:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:57.312 19:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:57.570 [ 00:20:57.570 { 00:20:57.570 "name": "BaseBdev2", 00:20:57.570 "aliases": [ 00:20:57.570 "88558d07-53ff-4bf0-80dd-8ad2f7ada37f" 00:20:57.570 ], 00:20:57.570 "product_name": "Malloc disk", 00:20:57.570 "block_size": 512, 00:20:57.570 "num_blocks": 65536, 00:20:57.570 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:20:57.570 "assigned_rate_limits": { 00:20:57.570 "rw_ios_per_sec": 0, 00:20:57.570 "rw_mbytes_per_sec": 0, 00:20:57.570 "r_mbytes_per_sec": 0, 00:20:57.570 "w_mbytes_per_sec": 0 00:20:57.570 }, 00:20:57.570 "claimed": false, 00:20:57.570 "zoned": false, 00:20:57.570 "supported_io_types": { 00:20:57.570 "read": true, 00:20:57.570 "write": true, 00:20:57.570 "unmap": true, 00:20:57.570 "flush": true, 00:20:57.570 "reset": true, 00:20:57.570 "nvme_admin": false, 00:20:57.570 "nvme_io": false, 00:20:57.570 "nvme_io_md": false, 00:20:57.570 "write_zeroes": true, 00:20:57.570 "zcopy": true, 00:20:57.570 "get_zone_info": false, 00:20:57.570 "zone_management": false, 00:20:57.570 "zone_append": false, 00:20:57.570 "compare": false, 00:20:57.570 "compare_and_write": false, 00:20:57.570 "abort": true, 00:20:57.570 "seek_hole": false, 00:20:57.570 "seek_data": false, 00:20:57.570 "copy": true, 00:20:57.570 "nvme_iov_md": 
false 00:20:57.570 }, 00:20:57.570 "memory_domains": [ 00:20:57.570 { 00:20:57.570 "dma_device_id": "system", 00:20:57.570 "dma_device_type": 1 00:20:57.570 }, 00:20:57.570 { 00:20:57.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:57.570 "dma_device_type": 2 00:20:57.570 } 00:20:57.570 ], 00:20:57.570 "driver_specific": {} 00:20:57.570 } 00:20:57.570 ] 00:20:57.570 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:57.570 19:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:57.570 19:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:57.571 19:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:57.828 BaseBdev3 00:20:57.828 19:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:57.828 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:57.828 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:57.828 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:57.828 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:57.828 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:57.828 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:58.087 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:58.346 [ 00:20:58.346 { 00:20:58.346 "name": "BaseBdev3", 00:20:58.346 "aliases": [ 00:20:58.346 "2beb4e84-0af7-4402-89f0-2f8abc4eae00" 00:20:58.346 ], 00:20:58.346 "product_name": "Malloc disk", 00:20:58.346 "block_size": 512, 00:20:58.346 "num_blocks": 65536, 00:20:58.346 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:20:58.346 "assigned_rate_limits": { 00:20:58.346 "rw_ios_per_sec": 0, 00:20:58.346 "rw_mbytes_per_sec": 0, 00:20:58.346 "r_mbytes_per_sec": 0, 00:20:58.346 "w_mbytes_per_sec": 0 00:20:58.346 }, 00:20:58.346 "claimed": false, 00:20:58.346 "zoned": false, 00:20:58.346 "supported_io_types": { 00:20:58.346 "read": true, 00:20:58.346 "write": true, 00:20:58.346 "unmap": true, 00:20:58.346 "flush": true, 00:20:58.346 "reset": true, 00:20:58.346 "nvme_admin": false, 00:20:58.346 "nvme_io": false, 00:20:58.346 "nvme_io_md": false, 00:20:58.346 "write_zeroes": true, 00:20:58.346 "zcopy": true, 00:20:58.346 "get_zone_info": false, 00:20:58.346 "zone_management": false, 00:20:58.346 "zone_append": false, 00:20:58.346 "compare": false, 00:20:58.346 "compare_and_write": false, 00:20:58.346 "abort": true, 00:20:58.346 "seek_hole": false, 00:20:58.346 "seek_data": false, 00:20:58.346 "copy": true, 00:20:58.346 "nvme_iov_md": false 00:20:58.346 }, 00:20:58.346 "memory_domains": [ 00:20:58.346 { 00:20:58.346 "dma_device_id": "system", 00:20:58.346 "dma_device_type": 1 00:20:58.346 }, 00:20:58.346 { 00:20:58.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.346 "dma_device_type": 2 00:20:58.346 } 00:20:58.346 ], 00:20:58.346 "driver_specific": {} 00:20:58.346 } 00:20:58.346 ] 00:20:58.346 19:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:58.346 19:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:58.346 19:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:20:58.346 19:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:58.605 BaseBdev4 00:20:58.605 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:58.605 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:58.605 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:58.605 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:58.605 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:58.605 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:58.605 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:58.864 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:58.864 [ 00:20:58.864 { 00:20:58.864 "name": "BaseBdev4", 00:20:58.864 "aliases": [ 00:20:58.864 "8218172e-c5e2-427d-b3e1-2ca84f7460de" 00:20:58.864 ], 00:20:58.864 "product_name": "Malloc disk", 00:20:58.864 "block_size": 512, 00:20:58.864 "num_blocks": 65536, 00:20:58.864 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:20:58.864 "assigned_rate_limits": { 00:20:58.864 "rw_ios_per_sec": 0, 00:20:58.864 "rw_mbytes_per_sec": 0, 00:20:58.864 "r_mbytes_per_sec": 0, 00:20:58.864 "w_mbytes_per_sec": 0 00:20:58.864 }, 00:20:58.864 "claimed": false, 00:20:58.864 "zoned": false, 00:20:58.864 "supported_io_types": { 00:20:58.864 
"read": true, 00:20:58.864 "write": true, 00:20:58.864 "unmap": true, 00:20:58.864 "flush": true, 00:20:58.864 "reset": true, 00:20:58.864 "nvme_admin": false, 00:20:58.864 "nvme_io": false, 00:20:58.864 "nvme_io_md": false, 00:20:58.864 "write_zeroes": true, 00:20:58.864 "zcopy": true, 00:20:58.864 "get_zone_info": false, 00:20:58.864 "zone_management": false, 00:20:58.864 "zone_append": false, 00:20:58.864 "compare": false, 00:20:58.864 "compare_and_write": false, 00:20:58.864 "abort": true, 00:20:58.864 "seek_hole": false, 00:20:58.864 "seek_data": false, 00:20:58.864 "copy": true, 00:20:58.864 "nvme_iov_md": false 00:20:58.864 }, 00:20:58.864 "memory_domains": [ 00:20:58.864 { 00:20:58.864 "dma_device_id": "system", 00:20:58.864 "dma_device_type": 1 00:20:58.864 }, 00:20:58.864 { 00:20:58.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.864 "dma_device_type": 2 00:20:58.864 } 00:20:58.864 ], 00:20:58.864 "driver_specific": {} 00:20:58.864 } 00:20:58.864 ] 00:20:59.124 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:59.124 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:59.124 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:59.124 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:59.124 [2024-07-24 19:56:50.707547] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:59.124 [2024-07-24 19:56:50.707588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:59.124 [2024-07-24 19:56:50.707608] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:59.124 [2024-07-24 
19:56:50.708939] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:59.124 [2024-07-24 19:56:50.708981] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.383 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.642 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.642 "name": "Existed_Raid", 00:20:59.642 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:20:59.642 "strip_size_kb": 64, 
00:20:59.642 "state": "configuring", 00:20:59.642 "raid_level": "concat", 00:20:59.642 "superblock": true, 00:20:59.642 "num_base_bdevs": 4, 00:20:59.642 "num_base_bdevs_discovered": 3, 00:20:59.642 "num_base_bdevs_operational": 4, 00:20:59.642 "base_bdevs_list": [ 00:20:59.642 { 00:20:59.642 "name": "BaseBdev1", 00:20:59.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.642 "is_configured": false, 00:20:59.642 "data_offset": 0, 00:20:59.642 "data_size": 0 00:20:59.642 }, 00:20:59.642 { 00:20:59.642 "name": "BaseBdev2", 00:20:59.642 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:20:59.642 "is_configured": true, 00:20:59.642 "data_offset": 2048, 00:20:59.642 "data_size": 63488 00:20:59.642 }, 00:20:59.642 { 00:20:59.642 "name": "BaseBdev3", 00:20:59.642 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:20:59.642 "is_configured": true, 00:20:59.642 "data_offset": 2048, 00:20:59.642 "data_size": 63488 00:20:59.642 }, 00:20:59.642 { 00:20:59.642 "name": "BaseBdev4", 00:20:59.642 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:20:59.642 "is_configured": true, 00:20:59.642 "data_offset": 2048, 00:20:59.642 "data_size": 63488 00:20:59.642 } 00:20:59.643 ] 00:20:59.643 }' 00:20:59.643 19:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.643 19:56:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:00.211 [2024-07-24 19:56:51.730216] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.211 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.470 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.470 "name": "Existed_Raid", 00:21:00.470 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:00.470 "strip_size_kb": 64, 00:21:00.470 "state": "configuring", 00:21:00.470 "raid_level": "concat", 00:21:00.470 "superblock": true, 00:21:00.470 "num_base_bdevs": 4, 00:21:00.470 "num_base_bdevs_discovered": 2, 00:21:00.470 "num_base_bdevs_operational": 4, 00:21:00.470 "base_bdevs_list": [ 00:21:00.470 { 00:21:00.470 "name": "BaseBdev1", 00:21:00.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.470 "is_configured": false, 00:21:00.470 "data_offset": 0, 00:21:00.470 "data_size": 0 
00:21:00.470 }, 00:21:00.470 { 00:21:00.470 "name": null, 00:21:00.470 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:00.470 "is_configured": false, 00:21:00.470 "data_offset": 2048, 00:21:00.470 "data_size": 63488 00:21:00.470 }, 00:21:00.470 { 00:21:00.470 "name": "BaseBdev3", 00:21:00.470 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:00.470 "is_configured": true, 00:21:00.470 "data_offset": 2048, 00:21:00.470 "data_size": 63488 00:21:00.470 }, 00:21:00.470 { 00:21:00.470 "name": "BaseBdev4", 00:21:00.470 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:00.470 "is_configured": true, 00:21:00.470 "data_offset": 2048, 00:21:00.470 "data_size": 63488 00:21:00.470 } 00:21:00.470 ] 00:21:00.470 }' 00:21:00.470 19:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.470 19:56:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:01.038 19:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.038 19:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:01.297 19:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:01.297 19:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:01.556 [2024-07-24 19:56:53.002183] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:01.556 BaseBdev1 00:21:01.556 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:01.556 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 
00:21:01.556 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:01.556 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:01.556 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:01.556 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:01.556 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:02.124 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:02.383 [ 00:21:02.383 { 00:21:02.383 "name": "BaseBdev1", 00:21:02.383 "aliases": [ 00:21:02.383 "35a823d8-a0b1-4472-a6dd-4048bd01c2cf" 00:21:02.383 ], 00:21:02.383 "product_name": "Malloc disk", 00:21:02.383 "block_size": 512, 00:21:02.383 "num_blocks": 65536, 00:21:02.383 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:02.383 "assigned_rate_limits": { 00:21:02.383 "rw_ios_per_sec": 0, 00:21:02.383 "rw_mbytes_per_sec": 0, 00:21:02.383 "r_mbytes_per_sec": 0, 00:21:02.383 "w_mbytes_per_sec": 0 00:21:02.383 }, 00:21:02.383 "claimed": true, 00:21:02.383 "claim_type": "exclusive_write", 00:21:02.383 "zoned": false, 00:21:02.383 "supported_io_types": { 00:21:02.383 "read": true, 00:21:02.383 "write": true, 00:21:02.383 "unmap": true, 00:21:02.383 "flush": true, 00:21:02.383 "reset": true, 00:21:02.383 "nvme_admin": false, 00:21:02.383 "nvme_io": false, 00:21:02.383 "nvme_io_md": false, 00:21:02.383 "write_zeroes": true, 00:21:02.383 "zcopy": true, 00:21:02.383 "get_zone_info": false, 00:21:02.383 "zone_management": false, 00:21:02.383 "zone_append": false, 00:21:02.383 "compare": false, 
00:21:02.383 "compare_and_write": false, 00:21:02.383 "abort": true, 00:21:02.383 "seek_hole": false, 00:21:02.383 "seek_data": false, 00:21:02.383 "copy": true, 00:21:02.383 "nvme_iov_md": false 00:21:02.383 }, 00:21:02.383 "memory_domains": [ 00:21:02.383 { 00:21:02.383 "dma_device_id": "system", 00:21:02.383 "dma_device_type": 1 00:21:02.383 }, 00:21:02.383 { 00:21:02.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.383 "dma_device_type": 2 00:21:02.383 } 00:21:02.383 ], 00:21:02.383 "driver_specific": {} 00:21:02.383 } 00:21:02.383 ] 00:21:02.383 19:56:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:02.383 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:02.383 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:02.383 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.384 19:56:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.643 19:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.643 "name": "Existed_Raid", 00:21:02.643 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:02.643 "strip_size_kb": 64, 00:21:02.643 "state": "configuring", 00:21:02.643 "raid_level": "concat", 00:21:02.643 "superblock": true, 00:21:02.643 "num_base_bdevs": 4, 00:21:02.643 "num_base_bdevs_discovered": 3, 00:21:02.643 "num_base_bdevs_operational": 4, 00:21:02.643 "base_bdevs_list": [ 00:21:02.643 { 00:21:02.643 "name": "BaseBdev1", 00:21:02.643 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:02.643 "is_configured": true, 00:21:02.643 "data_offset": 2048, 00:21:02.643 "data_size": 63488 00:21:02.643 }, 00:21:02.643 { 00:21:02.643 "name": null, 00:21:02.643 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:02.643 "is_configured": false, 00:21:02.643 "data_offset": 2048, 00:21:02.643 "data_size": 63488 00:21:02.643 }, 00:21:02.643 { 00:21:02.643 "name": "BaseBdev3", 00:21:02.643 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:02.643 "is_configured": true, 00:21:02.643 "data_offset": 2048, 00:21:02.643 "data_size": 63488 00:21:02.643 }, 00:21:02.643 { 00:21:02.643 "name": "BaseBdev4", 00:21:02.643 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:02.643 "is_configured": true, 00:21:02.643 "data_offset": 2048, 00:21:02.643 "data_size": 63488 00:21:02.643 } 00:21:02.643 ] 00:21:02.643 }' 00:21:02.643 19:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.643 19:56:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:03.211 19:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.211 19:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:03.470 19:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:03.470 19:56:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:03.729 [2024-07-24 19:56:55.296286] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.988 "name": "Existed_Raid", 00:21:03.988 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:03.988 "strip_size_kb": 64, 00:21:03.988 "state": "configuring", 00:21:03.988 "raid_level": "concat", 00:21:03.988 "superblock": true, 00:21:03.988 "num_base_bdevs": 4, 00:21:03.988 "num_base_bdevs_discovered": 2, 00:21:03.988 "num_base_bdevs_operational": 4, 00:21:03.988 "base_bdevs_list": [ 00:21:03.988 { 00:21:03.988 "name": "BaseBdev1", 00:21:03.988 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:03.988 "is_configured": true, 00:21:03.988 "data_offset": 2048, 00:21:03.988 "data_size": 63488 00:21:03.988 }, 00:21:03.988 { 00:21:03.988 "name": null, 00:21:03.988 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:03.988 "is_configured": false, 00:21:03.988 "data_offset": 2048, 00:21:03.988 "data_size": 63488 00:21:03.988 }, 00:21:03.988 { 00:21:03.988 "name": null, 00:21:03.988 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:03.988 "is_configured": false, 00:21:03.988 "data_offset": 2048, 00:21:03.988 "data_size": 63488 00:21:03.988 }, 00:21:03.988 { 00:21:03.988 "name": "BaseBdev4", 00:21:03.988 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:03.988 "is_configured": true, 00:21:03.988 "data_offset": 2048, 00:21:03.988 "data_size": 63488 00:21:03.988 } 00:21:03.988 ] 00:21:03.988 }' 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.988 19:56:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:04.925 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:04.925 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:04.925 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:04.925 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:05.184 [2024-07-24 19:56:56.651903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:05.184 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:05.443 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.443 "name": "Existed_Raid", 00:21:05.443 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:05.443 "strip_size_kb": 64, 00:21:05.443 "state": "configuring", 00:21:05.443 "raid_level": "concat", 00:21:05.443 "superblock": true, 00:21:05.443 "num_base_bdevs": 4, 00:21:05.443 "num_base_bdevs_discovered": 3, 00:21:05.443 "num_base_bdevs_operational": 4, 00:21:05.443 "base_bdevs_list": [ 00:21:05.443 { 00:21:05.443 "name": "BaseBdev1", 00:21:05.443 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:05.443 "is_configured": true, 00:21:05.443 "data_offset": 2048, 00:21:05.443 "data_size": 63488 00:21:05.443 }, 00:21:05.443 { 00:21:05.443 "name": null, 00:21:05.443 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:05.443 "is_configured": false, 00:21:05.443 "data_offset": 2048, 00:21:05.443 "data_size": 63488 00:21:05.443 }, 00:21:05.443 { 00:21:05.443 "name": "BaseBdev3", 00:21:05.443 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:05.443 "is_configured": true, 00:21:05.443 "data_offset": 2048, 00:21:05.443 "data_size": 63488 00:21:05.443 }, 00:21:05.443 { 00:21:05.443 "name": "BaseBdev4", 00:21:05.443 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:05.443 "is_configured": true, 00:21:05.443 "data_offset": 2048, 00:21:05.443 "data_size": 63488 00:21:05.443 } 00:21:05.443 ] 00:21:05.443 }' 00:21:05.443 19:56:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.443 19:56:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:06.011 19:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:06.011 19:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:06.270 19:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:06.270 19:56:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:06.529 [2024-07-24 19:56:57.983434] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:06.529 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.529 
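The `verify_raid_bdev_state` cycles traced above all follow one pattern: fetch the raid bdev list over the RPC socket, select the entry by name with `jq -r '.[] | select(.name == "Existed_Raid")'`, then compare `state`, `raid_level`, `strip_size_kb`, and the base-bdev counts against the expected values. A minimal Python sketch of that check logic follows; the `verify_raid_bdev_state` function name mirrors the shell helper, but the implementation and the abridged sample JSON (trimmed from the output captured above, after `BaseBdev1` was deleted) are illustrative, not the actual test script.

```python
import json

def verify_raid_bdev_state(bdevs, name, expected_state, raid_level,
                           strip_size_kb, num_operational):
    # Equivalent of: jq -r '.[] | select(.name == "...")'
    info = next(b for b in bdevs if b["name"] == name)
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size_kb
    assert info["num_base_bdevs_operational"] == num_operational
    # num_base_bdevs_discovered should equal the number of configured slots
    discovered = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert discovered == info["num_base_bdevs_discovered"]
    return info

# Abridged sample mirroring the log: BaseBdev1 deleted, so two slots are
# unconfigured and the array stays in "configuring" until they are re-added.
sample = json.loads('''[{"name": "Existed_Raid", "state": "configuring",
  "raid_level": "concat", "strip_size_kb": 64,
  "num_base_bdevs": 4, "num_base_bdevs_discovered": 2,
  "num_base_bdevs_operational": 4,
  "base_bdevs_list": [{"name": null, "is_configured": false},
                      {"name": null, "is_configured": false},
                      {"name": "BaseBdev3", "is_configured": true},
                      {"name": "BaseBdev4", "is_configured": true}]}]''')

info = verify_raid_bdev_state(sample, "Existed_Raid", "configuring",
                              "concat", 64, 4)
print(info["num_base_bdevs_discovered"])
```

In the real run the list comes from `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all`; the per-slot `[[ false == \f\a\l\s\e ]]` checks in the trace are the shell-side equivalent of the `is_configured` comparisons here.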
19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:06.788 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.788 "name": "Existed_Raid", 00:21:06.788 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:06.788 "strip_size_kb": 64, 00:21:06.788 "state": "configuring", 00:21:06.788 "raid_level": "concat", 00:21:06.788 "superblock": true, 00:21:06.788 "num_base_bdevs": 4, 00:21:06.788 "num_base_bdevs_discovered": 2, 00:21:06.788 "num_base_bdevs_operational": 4, 00:21:06.788 "base_bdevs_list": [ 00:21:06.788 { 00:21:06.788 "name": null, 00:21:06.788 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:06.788 "is_configured": false, 00:21:06.788 "data_offset": 2048, 00:21:06.788 "data_size": 63488 00:21:06.788 }, 00:21:06.788 { 00:21:06.788 "name": null, 00:21:06.788 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:06.788 "is_configured": false, 00:21:06.788 "data_offset": 2048, 00:21:06.788 "data_size": 63488 00:21:06.788 }, 00:21:06.788 { 00:21:06.788 "name": "BaseBdev3", 00:21:06.788 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:06.788 "is_configured": true, 00:21:06.788 "data_offset": 2048, 00:21:06.788 "data_size": 63488 00:21:06.788 }, 00:21:06.788 { 00:21:06.788 "name": "BaseBdev4", 00:21:06.788 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:06.788 "is_configured": true, 00:21:06.788 "data_offset": 2048, 00:21:06.788 "data_size": 63488 00:21:06.788 } 00:21:06.788 ] 00:21:06.788 }' 00:21:06.788 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.788 19:56:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:07.356 19:56:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.356 19:56:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:07.616 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:07.616 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:07.875 [2024-07-24 19:56:59.377792] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.875 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.875 19:56:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:08.135 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.135 "name": "Existed_Raid", 00:21:08.135 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:08.135 "strip_size_kb": 64, 00:21:08.135 "state": "configuring", 00:21:08.135 "raid_level": "concat", 00:21:08.135 "superblock": true, 00:21:08.135 "num_base_bdevs": 4, 00:21:08.135 "num_base_bdevs_discovered": 3, 00:21:08.135 "num_base_bdevs_operational": 4, 00:21:08.135 "base_bdevs_list": [ 00:21:08.135 { 00:21:08.135 "name": null, 00:21:08.135 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:08.135 "is_configured": false, 00:21:08.135 "data_offset": 2048, 00:21:08.135 "data_size": 63488 00:21:08.135 }, 00:21:08.135 { 00:21:08.135 "name": "BaseBdev2", 00:21:08.135 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:08.135 "is_configured": true, 00:21:08.135 "data_offset": 2048, 00:21:08.135 "data_size": 63488 00:21:08.135 }, 00:21:08.135 { 00:21:08.135 "name": "BaseBdev3", 00:21:08.135 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:08.135 "is_configured": true, 00:21:08.135 "data_offset": 2048, 00:21:08.135 "data_size": 63488 00:21:08.135 }, 00:21:08.135 { 00:21:08.135 "name": "BaseBdev4", 00:21:08.135 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:08.135 "is_configured": true, 00:21:08.135 "data_offset": 2048, 00:21:08.135 "data_size": 63488 00:21:08.135 } 00:21:08.135 ] 00:21:08.135 }' 00:21:08.135 19:56:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.135 19:56:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:08.704 19:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.704 19:57:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:09.271 19:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:09.271 19:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.271 19:57:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:09.531 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 35a823d8-a0b1-4472-a6dd-4048bd01c2cf 00:21:09.791 [2024-07-24 19:57:01.263220] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:09.791 [2024-07-24 19:57:01.263405] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2525a90 00:21:09.791 [2024-07-24 19:57:01.263420] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:09.791 [2024-07-24 19:57:01.263602] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2520a10 00:21:09.791 [2024-07-24 19:57:01.263723] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2525a90 00:21:09.791 [2024-07-24 19:57:01.263733] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2525a90 00:21:09.791 [2024-07-24 19:57:01.263824] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:09.791 NewBaseBdev 00:21:09.791 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:09.791 19:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:21:09.791 19:57:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:09.791 19:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:09.791 19:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:09.791 19:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:09.791 19:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:10.050 19:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:10.310 [ 00:21:10.310 { 00:21:10.310 "name": "NewBaseBdev", 00:21:10.310 "aliases": [ 00:21:10.310 "35a823d8-a0b1-4472-a6dd-4048bd01c2cf" 00:21:10.310 ], 00:21:10.310 "product_name": "Malloc disk", 00:21:10.310 "block_size": 512, 00:21:10.310 "num_blocks": 65536, 00:21:10.310 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:10.310 "assigned_rate_limits": { 00:21:10.310 "rw_ios_per_sec": 0, 00:21:10.310 "rw_mbytes_per_sec": 0, 00:21:10.310 "r_mbytes_per_sec": 0, 00:21:10.310 "w_mbytes_per_sec": 0 00:21:10.310 }, 00:21:10.310 "claimed": true, 00:21:10.310 "claim_type": "exclusive_write", 00:21:10.310 "zoned": false, 00:21:10.310 "supported_io_types": { 00:21:10.310 "read": true, 00:21:10.310 "write": true, 00:21:10.310 "unmap": true, 00:21:10.310 "flush": true, 00:21:10.310 "reset": true, 00:21:10.310 "nvme_admin": false, 00:21:10.310 "nvme_io": false, 00:21:10.310 "nvme_io_md": false, 00:21:10.310 "write_zeroes": true, 00:21:10.310 "zcopy": true, 00:21:10.310 "get_zone_info": false, 00:21:10.310 "zone_management": false, 00:21:10.310 "zone_append": false, 00:21:10.310 "compare": false, 00:21:10.310 
"compare_and_write": false, 00:21:10.310 "abort": true, 00:21:10.310 "seek_hole": false, 00:21:10.310 "seek_data": false, 00:21:10.310 "copy": true, 00:21:10.310 "nvme_iov_md": false 00:21:10.310 }, 00:21:10.310 "memory_domains": [ 00:21:10.310 { 00:21:10.310 "dma_device_id": "system", 00:21:10.310 "dma_device_type": 1 00:21:10.310 }, 00:21:10.310 { 00:21:10.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:10.310 "dma_device_type": 2 00:21:10.310 } 00:21:10.310 ], 00:21:10.310 "driver_specific": {} 00:21:10.310 } 00:21:10.310 ] 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.310 19:57:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.569 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.569 "name": "Existed_Raid", 00:21:10.569 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:10.569 "strip_size_kb": 64, 00:21:10.569 "state": "online", 00:21:10.569 "raid_level": "concat", 00:21:10.569 "superblock": true, 00:21:10.569 "num_base_bdevs": 4, 00:21:10.569 "num_base_bdevs_discovered": 4, 00:21:10.569 "num_base_bdevs_operational": 4, 00:21:10.569 "base_bdevs_list": [ 00:21:10.569 { 00:21:10.569 "name": "NewBaseBdev", 00:21:10.569 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:10.569 "is_configured": true, 00:21:10.569 "data_offset": 2048, 00:21:10.569 "data_size": 63488 00:21:10.569 }, 00:21:10.569 { 00:21:10.569 "name": "BaseBdev2", 00:21:10.569 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:10.569 "is_configured": true, 00:21:10.569 "data_offset": 2048, 00:21:10.569 "data_size": 63488 00:21:10.569 }, 00:21:10.569 { 00:21:10.569 "name": "BaseBdev3", 00:21:10.569 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:10.569 "is_configured": true, 00:21:10.569 "data_offset": 2048, 00:21:10.569 "data_size": 63488 00:21:10.569 }, 00:21:10.569 { 00:21:10.569 "name": "BaseBdev4", 00:21:10.569 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:10.569 "is_configured": true, 00:21:10.569 "data_offset": 2048, 00:21:10.569 "data_size": 63488 00:21:10.569 } 00:21:10.569 ] 00:21:10.569 }' 00:21:10.569 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.569 19:57:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:11.138 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:11.138 19:57:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:11.138 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:11.138 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:11.138 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:11.138 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:11.138 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:11.138 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:11.397 [2024-07-24 19:57:02.835770] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:11.397 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:11.397 "name": "Existed_Raid", 00:21:11.397 "aliases": [ 00:21:11.397 "b132390a-e902-4348-a67d-53f10b5d05ce" 00:21:11.397 ], 00:21:11.397 "product_name": "Raid Volume", 00:21:11.397 "block_size": 512, 00:21:11.397 "num_blocks": 253952, 00:21:11.397 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:11.397 "assigned_rate_limits": { 00:21:11.397 "rw_ios_per_sec": 0, 00:21:11.397 "rw_mbytes_per_sec": 0, 00:21:11.397 "r_mbytes_per_sec": 0, 00:21:11.397 "w_mbytes_per_sec": 0 00:21:11.397 }, 00:21:11.397 "claimed": false, 00:21:11.397 "zoned": false, 00:21:11.397 "supported_io_types": { 00:21:11.397 "read": true, 00:21:11.397 "write": true, 00:21:11.397 "unmap": true, 00:21:11.397 "flush": true, 00:21:11.397 "reset": true, 00:21:11.397 "nvme_admin": false, 00:21:11.398 "nvme_io": false, 00:21:11.398 "nvme_io_md": false, 00:21:11.398 "write_zeroes": true, 00:21:11.398 "zcopy": false, 00:21:11.398 
"get_zone_info": false, 00:21:11.398 "zone_management": false, 00:21:11.398 "zone_append": false, 00:21:11.398 "compare": false, 00:21:11.398 "compare_and_write": false, 00:21:11.398 "abort": false, 00:21:11.398 "seek_hole": false, 00:21:11.398 "seek_data": false, 00:21:11.398 "copy": false, 00:21:11.398 "nvme_iov_md": false 00:21:11.398 }, 00:21:11.398 "memory_domains": [ 00:21:11.398 { 00:21:11.398 "dma_device_id": "system", 00:21:11.398 "dma_device_type": 1 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.398 "dma_device_type": 2 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "dma_device_id": "system", 00:21:11.398 "dma_device_type": 1 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.398 "dma_device_type": 2 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "dma_device_id": "system", 00:21:11.398 "dma_device_type": 1 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.398 "dma_device_type": 2 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "dma_device_id": "system", 00:21:11.398 "dma_device_type": 1 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.398 "dma_device_type": 2 00:21:11.398 } 00:21:11.398 ], 00:21:11.398 "driver_specific": { 00:21:11.398 "raid": { 00:21:11.398 "uuid": "b132390a-e902-4348-a67d-53f10b5d05ce", 00:21:11.398 "strip_size_kb": 64, 00:21:11.398 "state": "online", 00:21:11.398 "raid_level": "concat", 00:21:11.398 "superblock": true, 00:21:11.398 "num_base_bdevs": 4, 00:21:11.398 "num_base_bdevs_discovered": 4, 00:21:11.398 "num_base_bdevs_operational": 4, 00:21:11.398 "base_bdevs_list": [ 00:21:11.398 { 00:21:11.398 "name": "NewBaseBdev", 00:21:11.398 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:11.398 "is_configured": true, 00:21:11.398 "data_offset": 2048, 00:21:11.398 "data_size": 63488 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "name": "BaseBdev2", 00:21:11.398 
"uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:11.398 "is_configured": true, 00:21:11.398 "data_offset": 2048, 00:21:11.398 "data_size": 63488 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "name": "BaseBdev3", 00:21:11.398 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:11.398 "is_configured": true, 00:21:11.398 "data_offset": 2048, 00:21:11.398 "data_size": 63488 00:21:11.398 }, 00:21:11.398 { 00:21:11.398 "name": "BaseBdev4", 00:21:11.398 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:11.398 "is_configured": true, 00:21:11.398 "data_offset": 2048, 00:21:11.398 "data_size": 63488 00:21:11.398 } 00:21:11.398 ] 00:21:11.398 } 00:21:11.398 } 00:21:11.398 }' 00:21:11.398 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:11.398 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:11.398 BaseBdev2 00:21:11.398 BaseBdev3 00:21:11.398 BaseBdev4' 00:21:11.398 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:11.398 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:11.398 19:57:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:11.657 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:11.657 "name": "NewBaseBdev", 00:21:11.657 "aliases": [ 00:21:11.657 "35a823d8-a0b1-4472-a6dd-4048bd01c2cf" 00:21:11.657 ], 00:21:11.657 "product_name": "Malloc disk", 00:21:11.657 "block_size": 512, 00:21:11.657 "num_blocks": 65536, 00:21:11.657 "uuid": "35a823d8-a0b1-4472-a6dd-4048bd01c2cf", 00:21:11.657 "assigned_rate_limits": { 00:21:11.657 "rw_ios_per_sec": 0, 00:21:11.657 "rw_mbytes_per_sec": 0, 
00:21:11.657 "r_mbytes_per_sec": 0, 00:21:11.657 "w_mbytes_per_sec": 0 00:21:11.657 }, 00:21:11.657 "claimed": true, 00:21:11.657 "claim_type": "exclusive_write", 00:21:11.657 "zoned": false, 00:21:11.657 "supported_io_types": { 00:21:11.657 "read": true, 00:21:11.657 "write": true, 00:21:11.657 "unmap": true, 00:21:11.657 "flush": true, 00:21:11.657 "reset": true, 00:21:11.657 "nvme_admin": false, 00:21:11.657 "nvme_io": false, 00:21:11.657 "nvme_io_md": false, 00:21:11.657 "write_zeroes": true, 00:21:11.657 "zcopy": true, 00:21:11.657 "get_zone_info": false, 00:21:11.657 "zone_management": false, 00:21:11.657 "zone_append": false, 00:21:11.657 "compare": false, 00:21:11.657 "compare_and_write": false, 00:21:11.657 "abort": true, 00:21:11.657 "seek_hole": false, 00:21:11.657 "seek_data": false, 00:21:11.657 "copy": true, 00:21:11.657 "nvme_iov_md": false 00:21:11.657 }, 00:21:11.657 "memory_domains": [ 00:21:11.657 { 00:21:11.657 "dma_device_id": "system", 00:21:11.657 "dma_device_type": 1 00:21:11.657 }, 00:21:11.657 { 00:21:11.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.657 "dma_device_type": 2 00:21:11.657 } 00:21:11.657 ], 00:21:11.657 "driver_specific": {} 00:21:11.657 }' 00:21:11.657 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.657 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:11.657 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:11.657 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:11.917 19:57:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:11.917 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:12.176 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:12.176 "name": "BaseBdev2", 00:21:12.176 "aliases": [ 00:21:12.176 "88558d07-53ff-4bf0-80dd-8ad2f7ada37f" 00:21:12.176 ], 00:21:12.176 "product_name": "Malloc disk", 00:21:12.176 "block_size": 512, 00:21:12.176 "num_blocks": 65536, 00:21:12.176 "uuid": "88558d07-53ff-4bf0-80dd-8ad2f7ada37f", 00:21:12.176 "assigned_rate_limits": { 00:21:12.176 "rw_ios_per_sec": 0, 00:21:12.176 "rw_mbytes_per_sec": 0, 00:21:12.176 "r_mbytes_per_sec": 0, 00:21:12.176 "w_mbytes_per_sec": 0 00:21:12.176 }, 00:21:12.176 "claimed": true, 00:21:12.176 "claim_type": "exclusive_write", 00:21:12.176 "zoned": false, 00:21:12.176 "supported_io_types": { 00:21:12.176 "read": true, 00:21:12.176 "write": true, 00:21:12.176 "unmap": true, 00:21:12.176 "flush": true, 00:21:12.176 "reset": true, 00:21:12.176 "nvme_admin": false, 00:21:12.176 "nvme_io": false, 00:21:12.176 "nvme_io_md": false, 00:21:12.176 "write_zeroes": true, 00:21:12.176 "zcopy": true, 00:21:12.176 
"get_zone_info": false, 00:21:12.176 "zone_management": false, 00:21:12.176 "zone_append": false, 00:21:12.176 "compare": false, 00:21:12.176 "compare_and_write": false, 00:21:12.176 "abort": true, 00:21:12.176 "seek_hole": false, 00:21:12.176 "seek_data": false, 00:21:12.176 "copy": true, 00:21:12.176 "nvme_iov_md": false 00:21:12.176 }, 00:21:12.176 "memory_domains": [ 00:21:12.176 { 00:21:12.176 "dma_device_id": "system", 00:21:12.176 "dma_device_type": 1 00:21:12.176 }, 00:21:12.176 { 00:21:12.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.176 "dma_device_type": 2 00:21:12.176 } 00:21:12.176 ], 00:21:12.176 "driver_specific": {} 00:21:12.176 }' 00:21:12.176 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:12.435 19:57:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:12.435 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.694 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:12.694 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:12.694 19:57:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:12.694 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:12.694 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:12.953 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:12.953 "name": "BaseBdev3", 00:21:12.953 "aliases": [ 00:21:12.953 "2beb4e84-0af7-4402-89f0-2f8abc4eae00" 00:21:12.953 ], 00:21:12.953 "product_name": "Malloc disk", 00:21:12.953 "block_size": 512, 00:21:12.953 "num_blocks": 65536, 00:21:12.953 "uuid": "2beb4e84-0af7-4402-89f0-2f8abc4eae00", 00:21:12.953 "assigned_rate_limits": { 00:21:12.953 "rw_ios_per_sec": 0, 00:21:12.953 "rw_mbytes_per_sec": 0, 00:21:12.953 "r_mbytes_per_sec": 0, 00:21:12.953 "w_mbytes_per_sec": 0 00:21:12.953 }, 00:21:12.953 "claimed": true, 00:21:12.953 "claim_type": "exclusive_write", 00:21:12.953 "zoned": false, 00:21:12.953 "supported_io_types": { 00:21:12.953 "read": true, 00:21:12.953 "write": true, 00:21:12.953 "unmap": true, 00:21:12.953 "flush": true, 00:21:12.953 "reset": true, 00:21:12.953 "nvme_admin": false, 00:21:12.953 "nvme_io": false, 00:21:12.953 "nvme_io_md": false, 00:21:12.953 "write_zeroes": true, 00:21:12.953 "zcopy": true, 00:21:12.953 "get_zone_info": false, 00:21:12.953 "zone_management": false, 00:21:12.953 "zone_append": false, 00:21:12.953 "compare": false, 00:21:12.953 "compare_and_write": false, 00:21:12.953 "abort": true, 00:21:12.953 "seek_hole": false, 00:21:12.953 "seek_data": false, 00:21:12.953 "copy": true, 00:21:12.953 "nvme_iov_md": false 00:21:12.953 }, 00:21:12.953 "memory_domains": [ 00:21:12.953 { 00:21:12.954 "dma_device_id": "system", 00:21:12.954 "dma_device_type": 1 00:21:12.954 }, 00:21:12.954 { 00:21:12.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.954 
"dma_device_type": 2 00:21:12.954 } 00:21:12.954 ], 00:21:12.954 "driver_specific": {} 00:21:12.954 }' 00:21:12.954 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.954 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:12.954 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:12.954 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.954 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:12.954 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:12.954 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:13.213 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:13.472 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:13.472 "name": "BaseBdev4", 00:21:13.472 "aliases": [ 00:21:13.472 
"8218172e-c5e2-427d-b3e1-2ca84f7460de" 00:21:13.472 ], 00:21:13.472 "product_name": "Malloc disk", 00:21:13.472 "block_size": 512, 00:21:13.472 "num_blocks": 65536, 00:21:13.472 "uuid": "8218172e-c5e2-427d-b3e1-2ca84f7460de", 00:21:13.472 "assigned_rate_limits": { 00:21:13.472 "rw_ios_per_sec": 0, 00:21:13.472 "rw_mbytes_per_sec": 0, 00:21:13.472 "r_mbytes_per_sec": 0, 00:21:13.472 "w_mbytes_per_sec": 0 00:21:13.472 }, 00:21:13.472 "claimed": true, 00:21:13.472 "claim_type": "exclusive_write", 00:21:13.472 "zoned": false, 00:21:13.472 "supported_io_types": { 00:21:13.472 "read": true, 00:21:13.472 "write": true, 00:21:13.472 "unmap": true, 00:21:13.472 "flush": true, 00:21:13.472 "reset": true, 00:21:13.472 "nvme_admin": false, 00:21:13.472 "nvme_io": false, 00:21:13.472 "nvme_io_md": false, 00:21:13.472 "write_zeroes": true, 00:21:13.472 "zcopy": true, 00:21:13.472 "get_zone_info": false, 00:21:13.472 "zone_management": false, 00:21:13.472 "zone_append": false, 00:21:13.472 "compare": false, 00:21:13.472 "compare_and_write": false, 00:21:13.472 "abort": true, 00:21:13.472 "seek_hole": false, 00:21:13.472 "seek_data": false, 00:21:13.472 "copy": true, 00:21:13.472 "nvme_iov_md": false 00:21:13.472 }, 00:21:13.472 "memory_domains": [ 00:21:13.472 { 00:21:13.472 "dma_device_id": "system", 00:21:13.472 "dma_device_type": 1 00:21:13.472 }, 00:21:13.472 { 00:21:13.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.472 "dma_device_type": 2 00:21:13.472 } 00:21:13.472 ], 00:21:13.472 "driver_specific": {} 00:21:13.472 }' 00:21:13.472 19:57:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.472 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:13.472 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:13.472 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.731 19:57:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:13.731 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:13.731 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.732 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:13.732 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:13.732 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.732 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:13.991 [2024-07-24 19:57:05.490534] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:13.991 [2024-07-24 19:57:05.490563] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:13.991 [2024-07-24 19:57:05.490617] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:13.991 [2024-07-24 19:57:05.490680] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:13.991 [2024-07-24 19:57:05.490693] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2525a90 name Existed_Raid, state offline 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1456559 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1456559 ']' 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@954 -- # kill -0 1456559 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1456559 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1456559' 00:21:13.991 killing process with pid 1456559 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1456559 00:21:13.991 [2024-07-24 19:57:05.582027] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:13.991 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1456559 00:21:14.250 [2024-07-24 19:57:05.620221] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:14.250 19:57:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:14.250 00:21:14.250 real 0m34.657s 00:21:14.250 user 1m3.703s 00:21:14.250 sys 0m6.101s 00:21:14.250 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:14.250 19:57:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:14.250 ************************************ 00:21:14.250 END TEST raid_state_function_test_sb 00:21:14.250 ************************************ 00:21:14.509 19:57:05 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:21:14.509 19:57:05 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:21:14.509 19:57:05 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:14.509 19:57:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:14.509 ************************************ 00:21:14.509 START TEST raid_superblock_test 00:21:14.509 ************************************ 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 4 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:21:14.509 19:57:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1462063 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1462063 /var/tmp/spdk-raid.sock 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1462063 ']' 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:14.509 19:57:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:14.510 19:57:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:14.510 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:14.510 19:57:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:14.510 19:57:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.510 [2024-07-24 19:57:05.987760] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:21:14.510 [2024-07-24 19:57:05.987819] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1462063 ] 00:21:14.510 [2024-07-24 19:57:06.102130] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:14.768 [2024-07-24 19:57:06.210276] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:14.768 [2024-07-24 19:57:06.273645] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:14.768 [2024-07-24 19:57:06.273684] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:15.335 19:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:15.618 malloc1 00:21:15.618 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:15.618 [2024-07-24 19:57:07.202561] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:15.618 [2024-07-24 19:57:07.202615] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:15.618 [2024-07-24 19:57:07.202637] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1852590 00:21:15.618 [2024-07-24 19:57:07.202650] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:15.618 [2024-07-24 19:57:07.204203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:15.618 [2024-07-24 19:57:07.204231] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:15.618 pt1 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:15.880 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:15.880 19:57:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:16.138 malloc2 00:21:16.397 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:16.397 [2024-07-24 19:57:07.973293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:16.397 [2024-07-24 19:57:07.973343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:16.397 [2024-07-24 19:57:07.973363] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f8690 00:21:16.397 [2024-07-24 19:57:07.973376] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:16.397 [2024-07-24 19:57:07.974937] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:16.397 [2024-07-24 19:57:07.974965] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:16.397 pt2 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:16.657 19:57:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:16.657 19:57:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:16.917 malloc3 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:17.187 [2024-07-24 19:57:08.737186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:17.187 [2024-07-24 19:57:08.737241] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.187 [2024-07-24 19:57:08.737262] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f9fc0 00:21:17.187 [2024-07-24 19:57:08.737275] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.187 [2024-07-24 19:57:08.738884] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.187 [2024-07-24 19:57:08.738916] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:17.187 pt3 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:17.187 
19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:17.187 19:57:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:17.823 malloc4 00:21:17.823 19:57:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:18.082 [2024-07-24 19:57:09.501080] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:18.082 [2024-07-24 19:57:09.501129] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:18.082 [2024-07-24 19:57:09.501152] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19fb1c0 00:21:18.082 [2024-07-24 19:57:09.501165] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:18.082 [2024-07-24 19:57:09.502710] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:18.082 [2024-07-24 19:57:09.502741] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:18.082 pt4 00:21:18.082 19:57:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:18.082 19:57:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:18.082 19:57:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:18.650 [2024-07-24 19:57:09.998410] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:21:18.650 [2024-07-24 19:57:09.999729] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:18.650 [2024-07-24 19:57:09.999786] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:18.650 [2024-07-24 19:57:09.999832] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:18.650 [2024-07-24 19:57:10.000010] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a03e80 00:21:18.650 [2024-07-24 19:57:10.000022] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:18.650 [2024-07-24 19:57:10.000228] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1869480 00:21:18.650 [2024-07-24 19:57:10.000379] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a03e80 00:21:18.650 [2024-07-24 19:57:10.000397] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a03e80 00:21:18.650 [2024-07-24 19:57:10.000502] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.650 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.910 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.910 "name": "raid_bdev1", 00:21:18.910 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:18.910 "strip_size_kb": 64, 00:21:18.910 "state": "online", 00:21:18.910 "raid_level": "concat", 00:21:18.910 "superblock": true, 00:21:18.910 "num_base_bdevs": 4, 00:21:18.910 "num_base_bdevs_discovered": 4, 00:21:18.910 "num_base_bdevs_operational": 4, 00:21:18.910 "base_bdevs_list": [ 00:21:18.910 { 00:21:18.910 "name": "pt1", 00:21:18.910 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:18.910 "is_configured": true, 00:21:18.910 "data_offset": 2048, 00:21:18.910 "data_size": 63488 00:21:18.910 }, 00:21:18.910 { 00:21:18.910 "name": "pt2", 00:21:18.910 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:18.910 "is_configured": true, 00:21:18.910 "data_offset": 2048, 00:21:18.910 "data_size": 63488 00:21:18.910 }, 00:21:18.910 { 00:21:18.910 "name": "pt3", 00:21:18.910 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:18.910 "is_configured": true, 00:21:18.910 "data_offset": 2048, 00:21:18.910 "data_size": 63488 00:21:18.910 }, 00:21:18.910 { 00:21:18.910 "name": "pt4", 00:21:18.910 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:18.910 "is_configured": true, 00:21:18.910 "data_offset": 2048, 00:21:18.910 "data_size": 63488 00:21:18.910 } 00:21:18.910 ] 00:21:18.910 }' 00:21:18.910 19:57:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.910 19:57:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:19.478 19:57:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:19.737 [2024-07-24 19:57:11.121632] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:19.737 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:19.737 "name": "raid_bdev1", 00:21:19.737 "aliases": [ 00:21:19.737 "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac" 00:21:19.737 ], 00:21:19.737 "product_name": "Raid Volume", 00:21:19.737 "block_size": 512, 00:21:19.737 "num_blocks": 253952, 00:21:19.737 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:19.737 "assigned_rate_limits": { 00:21:19.737 "rw_ios_per_sec": 0, 00:21:19.737 "rw_mbytes_per_sec": 0, 00:21:19.737 "r_mbytes_per_sec": 0, 00:21:19.737 "w_mbytes_per_sec": 0 00:21:19.737 }, 00:21:19.737 "claimed": false, 00:21:19.737 "zoned": false, 00:21:19.737 "supported_io_types": { 00:21:19.737 "read": true, 00:21:19.737 "write": true, 00:21:19.737 
"unmap": true, 00:21:19.737 "flush": true, 00:21:19.737 "reset": true, 00:21:19.737 "nvme_admin": false, 00:21:19.737 "nvme_io": false, 00:21:19.737 "nvme_io_md": false, 00:21:19.737 "write_zeroes": true, 00:21:19.737 "zcopy": false, 00:21:19.737 "get_zone_info": false, 00:21:19.737 "zone_management": false, 00:21:19.737 "zone_append": false, 00:21:19.737 "compare": false, 00:21:19.737 "compare_and_write": false, 00:21:19.737 "abort": false, 00:21:19.737 "seek_hole": false, 00:21:19.737 "seek_data": false, 00:21:19.737 "copy": false, 00:21:19.737 "nvme_iov_md": false 00:21:19.737 }, 00:21:19.737 "memory_domains": [ 00:21:19.737 { 00:21:19.737 "dma_device_id": "system", 00:21:19.737 "dma_device_type": 1 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.737 "dma_device_type": 2 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "dma_device_id": "system", 00:21:19.737 "dma_device_type": 1 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.737 "dma_device_type": 2 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "dma_device_id": "system", 00:21:19.737 "dma_device_type": 1 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.737 "dma_device_type": 2 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "dma_device_id": "system", 00:21:19.737 "dma_device_type": 1 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.737 "dma_device_type": 2 00:21:19.737 } 00:21:19.737 ], 00:21:19.737 "driver_specific": { 00:21:19.737 "raid": { 00:21:19.737 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:19.737 "strip_size_kb": 64, 00:21:19.737 "state": "online", 00:21:19.737 "raid_level": "concat", 00:21:19.737 "superblock": true, 00:21:19.737 "num_base_bdevs": 4, 00:21:19.737 "num_base_bdevs_discovered": 4, 00:21:19.737 "num_base_bdevs_operational": 4, 00:21:19.737 "base_bdevs_list": [ 00:21:19.737 { 00:21:19.737 "name": "pt1", 
00:21:19.737 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:19.737 "is_configured": true, 00:21:19.737 "data_offset": 2048, 00:21:19.737 "data_size": 63488 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "name": "pt2", 00:21:19.737 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:19.737 "is_configured": true, 00:21:19.737 "data_offset": 2048, 00:21:19.737 "data_size": 63488 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "name": "pt3", 00:21:19.737 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:19.737 "is_configured": true, 00:21:19.737 "data_offset": 2048, 00:21:19.737 "data_size": 63488 00:21:19.737 }, 00:21:19.737 { 00:21:19.737 "name": "pt4", 00:21:19.737 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:19.737 "is_configured": true, 00:21:19.737 "data_offset": 2048, 00:21:19.737 "data_size": 63488 00:21:19.737 } 00:21:19.737 ] 00:21:19.737 } 00:21:19.737 } 00:21:19.737 }' 00:21:19.737 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:19.737 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:19.737 pt2 00:21:19.737 pt3 00:21:19.737 pt4' 00:21:19.737 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.738 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:19.738 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:19.997 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:19.997 "name": "pt1", 00:21:19.997 "aliases": [ 00:21:19.997 "00000000-0000-0000-0000-000000000001" 00:21:19.997 ], 00:21:19.997 "product_name": "passthru", 00:21:19.997 "block_size": 512, 00:21:19.997 "num_blocks": 65536, 00:21:19.997 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:19.997 "assigned_rate_limits": { 00:21:19.997 "rw_ios_per_sec": 0, 00:21:19.997 "rw_mbytes_per_sec": 0, 00:21:19.997 "r_mbytes_per_sec": 0, 00:21:19.997 "w_mbytes_per_sec": 0 00:21:19.997 }, 00:21:19.997 "claimed": true, 00:21:19.997 "claim_type": "exclusive_write", 00:21:19.997 "zoned": false, 00:21:19.997 "supported_io_types": { 00:21:19.997 "read": true, 00:21:19.997 "write": true, 00:21:19.997 "unmap": true, 00:21:19.997 "flush": true, 00:21:19.997 "reset": true, 00:21:19.997 "nvme_admin": false, 00:21:19.997 "nvme_io": false, 00:21:19.997 "nvme_io_md": false, 00:21:19.997 "write_zeroes": true, 00:21:19.997 "zcopy": true, 00:21:19.997 "get_zone_info": false, 00:21:19.997 "zone_management": false, 00:21:19.997 "zone_append": false, 00:21:19.997 "compare": false, 00:21:19.997 "compare_and_write": false, 00:21:19.997 "abort": true, 00:21:19.997 "seek_hole": false, 00:21:19.997 "seek_data": false, 00:21:19.997 "copy": true, 00:21:19.997 "nvme_iov_md": false 00:21:19.997 }, 00:21:19.997 "memory_domains": [ 00:21:19.997 { 00:21:19.997 "dma_device_id": "system", 00:21:19.997 "dma_device_type": 1 00:21:19.997 }, 00:21:19.997 { 00:21:19.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.997 "dma_device_type": 2 00:21:19.997 } 00:21:19.997 ], 00:21:19.997 "driver_specific": { 00:21:19.997 "passthru": { 00:21:19.997 "name": "pt1", 00:21:19.997 "base_bdev_name": "malloc1" 00:21:19.997 } 00:21:19.997 } 00:21:19.997 }' 00:21:19.997 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.997 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.997 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:19.997 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.997 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.256 19:57:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:20.256 19:57:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:20.515 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:20.515 "name": "pt2", 00:21:20.515 "aliases": [ 00:21:20.515 "00000000-0000-0000-0000-000000000002" 00:21:20.515 ], 00:21:20.515 "product_name": "passthru", 00:21:20.515 "block_size": 512, 00:21:20.515 "num_blocks": 65536, 00:21:20.515 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:20.515 "assigned_rate_limits": { 00:21:20.515 "rw_ios_per_sec": 0, 00:21:20.515 "rw_mbytes_per_sec": 0, 00:21:20.515 "r_mbytes_per_sec": 0, 00:21:20.515 "w_mbytes_per_sec": 0 00:21:20.515 }, 00:21:20.515 "claimed": true, 00:21:20.515 "claim_type": "exclusive_write", 00:21:20.515 "zoned": false, 00:21:20.515 "supported_io_types": { 00:21:20.515 "read": true, 00:21:20.515 "write": true, 00:21:20.515 "unmap": true, 00:21:20.515 "flush": true, 00:21:20.515 "reset": true, 00:21:20.515 "nvme_admin": false, 00:21:20.515 
"nvme_io": false, 00:21:20.515 "nvme_io_md": false, 00:21:20.515 "write_zeroes": true, 00:21:20.515 "zcopy": true, 00:21:20.515 "get_zone_info": false, 00:21:20.515 "zone_management": false, 00:21:20.515 "zone_append": false, 00:21:20.515 "compare": false, 00:21:20.515 "compare_and_write": false, 00:21:20.515 "abort": true, 00:21:20.515 "seek_hole": false, 00:21:20.515 "seek_data": false, 00:21:20.515 "copy": true, 00:21:20.515 "nvme_iov_md": false 00:21:20.515 }, 00:21:20.515 "memory_domains": [ 00:21:20.515 { 00:21:20.515 "dma_device_id": "system", 00:21:20.515 "dma_device_type": 1 00:21:20.515 }, 00:21:20.515 { 00:21:20.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.515 "dma_device_type": 2 00:21:20.515 } 00:21:20.515 ], 00:21:20.515 "driver_specific": { 00:21:20.515 "passthru": { 00:21:20.515 "name": "pt2", 00:21:20.515 "base_bdev_name": "malloc2" 00:21:20.515 } 00:21:20.515 } 00:21:20.515 }' 00:21:20.515 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.515 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:20.775 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:21.034 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.034 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.034 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:21.034 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.034 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.034 "name": "pt3", 00:21:21.034 "aliases": [ 00:21:21.034 "00000000-0000-0000-0000-000000000003" 00:21:21.034 ], 00:21:21.034 "product_name": "passthru", 00:21:21.034 "block_size": 512, 00:21:21.034 "num_blocks": 65536, 00:21:21.034 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:21.034 "assigned_rate_limits": { 00:21:21.034 "rw_ios_per_sec": 0, 00:21:21.034 "rw_mbytes_per_sec": 0, 00:21:21.034 "r_mbytes_per_sec": 0, 00:21:21.034 "w_mbytes_per_sec": 0 00:21:21.034 }, 00:21:21.034 "claimed": true, 00:21:21.034 "claim_type": "exclusive_write", 00:21:21.034 "zoned": false, 00:21:21.034 "supported_io_types": { 00:21:21.034 "read": true, 00:21:21.034 "write": true, 00:21:21.034 "unmap": true, 00:21:21.034 "flush": true, 00:21:21.034 "reset": true, 00:21:21.034 "nvme_admin": false, 00:21:21.034 "nvme_io": false, 00:21:21.034 "nvme_io_md": false, 00:21:21.034 "write_zeroes": true, 00:21:21.034 "zcopy": true, 00:21:21.034 "get_zone_info": false, 00:21:21.034 "zone_management": false, 00:21:21.034 "zone_append": false, 00:21:21.034 "compare": false, 00:21:21.034 "compare_and_write": false, 00:21:21.034 "abort": true, 00:21:21.034 "seek_hole": false, 00:21:21.034 "seek_data": false, 00:21:21.034 "copy": true, 00:21:21.034 "nvme_iov_md": false 00:21:21.034 }, 00:21:21.034 "memory_domains": [ 00:21:21.034 { 00:21:21.034 "dma_device_id": "system", 00:21:21.034 
"dma_device_type": 1 00:21:21.034 }, 00:21:21.034 { 00:21:21.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.034 "dma_device_type": 2 00:21:21.034 } 00:21:21.034 ], 00:21:21.034 "driver_specific": { 00:21:21.034 "passthru": { 00:21:21.034 "name": "pt3", 00:21:21.034 "base_bdev_name": "malloc3" 00:21:21.034 } 00:21:21.034 } 00:21:21.034 }' 00:21:21.034 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.293 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.293 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:21.293 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.293 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.293 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:21.293 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.293 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.553 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:21.553 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.553 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.553 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.553 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.553 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:21.553 19:57:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.812 19:57:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.812 "name": "pt4", 00:21:21.812 "aliases": [ 00:21:21.812 "00000000-0000-0000-0000-000000000004" 00:21:21.812 ], 00:21:21.812 "product_name": "passthru", 00:21:21.812 "block_size": 512, 00:21:21.812 "num_blocks": 65536, 00:21:21.812 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:21.812 "assigned_rate_limits": { 00:21:21.812 "rw_ios_per_sec": 0, 00:21:21.812 "rw_mbytes_per_sec": 0, 00:21:21.812 "r_mbytes_per_sec": 0, 00:21:21.812 "w_mbytes_per_sec": 0 00:21:21.812 }, 00:21:21.812 "claimed": true, 00:21:21.812 "claim_type": "exclusive_write", 00:21:21.812 "zoned": false, 00:21:21.812 "supported_io_types": { 00:21:21.812 "read": true, 00:21:21.812 "write": true, 00:21:21.812 "unmap": true, 00:21:21.812 "flush": true, 00:21:21.812 "reset": true, 00:21:21.812 "nvme_admin": false, 00:21:21.812 "nvme_io": false, 00:21:21.812 "nvme_io_md": false, 00:21:21.812 "write_zeroes": true, 00:21:21.812 "zcopy": true, 00:21:21.812 "get_zone_info": false, 00:21:21.812 "zone_management": false, 00:21:21.812 "zone_append": false, 00:21:21.812 "compare": false, 00:21:21.812 "compare_and_write": false, 00:21:21.812 "abort": true, 00:21:21.812 "seek_hole": false, 00:21:21.812 "seek_data": false, 00:21:21.812 "copy": true, 00:21:21.812 "nvme_iov_md": false 00:21:21.812 }, 00:21:21.812 "memory_domains": [ 00:21:21.812 { 00:21:21.812 "dma_device_id": "system", 00:21:21.812 "dma_device_type": 1 00:21:21.812 }, 00:21:21.812 { 00:21:21.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.812 "dma_device_type": 2 00:21:21.812 } 00:21:21.812 ], 00:21:21.812 "driver_specific": { 00:21:21.812 "passthru": { 00:21:21.812 "name": "pt4", 00:21:21.812 "base_bdev_name": "malloc4" 00:21:21.812 } 00:21:21.812 } 00:21:21.812 }' 00:21:21.813 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.813 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.813 19:57:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:21.813 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.813 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.813 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:21.813 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.071 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.071 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.071 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.071 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.071 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:22.072 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:22.072 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:21:22.330 [2024-07-24 19:57:13.796776] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:22.330 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac 00:21:22.330 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac ']' 00:21:22.330 19:57:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:22.590 [2024-07-24 19:57:14.041119] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:22.590 
[2024-07-24 19:57:14.041145] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:22.590 [2024-07-24 19:57:14.041199] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:22.590 [2024-07-24 19:57:14.041266] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:22.590 [2024-07-24 19:57:14.041278] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a03e80 name raid_bdev1, state offline 00:21:22.590 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.590 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:21:22.849 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:21:22.849 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:21:22.849 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:22.849 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:23.109 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:23.109 19:57:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:23.677 19:57:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:23.677 19:57:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:23.936 19:57:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:23.936 19:57:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:24.505 19:57:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:24.505 19:57:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:24.505 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:24.766 [2024-07-24 19:57:16.303072] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:24.766 [2024-07-24 19:57:16.304480] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:24.766 [2024-07-24 19:57:16.304525] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:24.766 [2024-07-24 19:57:16.304561] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:24.766 [2024-07-24 19:57:16.304610] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:24.766 [2024-07-24 19:57:16.304652] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:24.766 [2024-07-24 19:57:16.304676] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:24.766 [2024-07-24 19:57:16.304698] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:24.766 
[2024-07-24 19:57:16.304716] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:24.766 [2024-07-24 19:57:16.304728] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19f9c40 name raid_bdev1, state configuring 00:21:24.766 request: 00:21:24.766 { 00:21:24.766 "name": "raid_bdev1", 00:21:24.766 "raid_level": "concat", 00:21:24.766 "base_bdevs": [ 00:21:24.766 "malloc1", 00:21:24.766 "malloc2", 00:21:24.766 "malloc3", 00:21:24.766 "malloc4" 00:21:24.766 ], 00:21:24.766 "strip_size_kb": 64, 00:21:24.766 "superblock": false, 00:21:24.766 "method": "bdev_raid_create", 00:21:24.766 "req_id": 1 00:21:24.766 } 00:21:24.766 Got JSON-RPC error response 00:21:24.766 response: 00:21:24.766 { 00:21:24.766 "code": -17, 00:21:24.766 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:24.766 } 00:21:24.766 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:21:24.766 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:21:24.766 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:21:24.766 19:57:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:21:24.766 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.766 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:21:25.026 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:21:25.026 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:21:25.026 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:21:25.285 [2024-07-24 19:57:16.800321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:25.285 [2024-07-24 19:57:16.800373] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.285 [2024-07-24 19:57:16.800409] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f8460 00:21:25.285 [2024-07-24 19:57:16.800425] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.285 [2024-07-24 19:57:16.802094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.285 [2024-07-24 19:57:16.802126] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:25.285 [2024-07-24 19:57:16.802198] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:25.285 [2024-07-24 19:57:16.802230] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:25.285 pt1 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.285 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.286 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:25.286 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.286 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.286 19:57:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.546 19:57:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.547 "name": "raid_bdev1", 00:21:25.547 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:25.547 "strip_size_kb": 64, 00:21:25.547 "state": "configuring", 00:21:25.547 "raid_level": "concat", 00:21:25.547 "superblock": true, 00:21:25.547 "num_base_bdevs": 4, 00:21:25.547 "num_base_bdevs_discovered": 1, 00:21:25.547 "num_base_bdevs_operational": 4, 00:21:25.547 "base_bdevs_list": [ 00:21:25.547 { 00:21:25.547 "name": "pt1", 00:21:25.547 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:25.547 "is_configured": true, 00:21:25.547 "data_offset": 2048, 00:21:25.547 "data_size": 63488 00:21:25.547 }, 00:21:25.547 { 00:21:25.547 "name": null, 00:21:25.547 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:25.547 "is_configured": false, 00:21:25.547 "data_offset": 2048, 00:21:25.547 "data_size": 63488 00:21:25.547 }, 00:21:25.547 { 00:21:25.547 "name": null, 00:21:25.547 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:25.547 "is_configured": false, 00:21:25.547 "data_offset": 2048, 00:21:25.547 "data_size": 63488 00:21:25.547 }, 00:21:25.547 { 00:21:25.547 "name": null, 00:21:25.547 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:25.547 "is_configured": false, 00:21:25.547 "data_offset": 2048, 00:21:25.547 "data_size": 63488 00:21:25.547 } 00:21:25.547 ] 00:21:25.547 }' 00:21:25.547 19:57:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.547 19:57:17 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:26.117 19:57:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:21:26.117 19:57:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:26.376 [2024-07-24 19:57:17.895223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:26.376 [2024-07-24 19:57:17.895277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.376 [2024-07-24 19:57:17.895299] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184bda0 00:21:26.376 [2024-07-24 19:57:17.895312] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.376 [2024-07-24 19:57:17.895679] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.376 [2024-07-24 19:57:17.895698] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:26.376 [2024-07-24 19:57:17.895764] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:26.376 [2024-07-24 19:57:17.895784] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:26.376 pt2 00:21:26.376 19:57:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:26.635 [2024-07-24 19:57:18.140017] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:26.635 19:57:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.635 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:26.893 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.893 "name": "raid_bdev1", 00:21:26.893 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:26.893 "strip_size_kb": 64, 00:21:26.893 "state": "configuring", 00:21:26.893 "raid_level": "concat", 00:21:26.893 "superblock": true, 00:21:26.893 "num_base_bdevs": 4, 00:21:26.893 "num_base_bdevs_discovered": 1, 00:21:26.893 "num_base_bdevs_operational": 4, 00:21:26.893 "base_bdevs_list": [ 00:21:26.893 { 00:21:26.893 "name": "pt1", 00:21:26.893 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:26.893 "is_configured": true, 00:21:26.893 "data_offset": 2048, 00:21:26.893 "data_size": 63488 00:21:26.893 }, 00:21:26.893 { 00:21:26.894 "name": null, 00:21:26.894 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:26.894 
"is_configured": false, 00:21:26.894 "data_offset": 2048, 00:21:26.894 "data_size": 63488 00:21:26.894 }, 00:21:26.894 { 00:21:26.894 "name": null, 00:21:26.894 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:26.894 "is_configured": false, 00:21:26.894 "data_offset": 2048, 00:21:26.894 "data_size": 63488 00:21:26.894 }, 00:21:26.894 { 00:21:26.894 "name": null, 00:21:26.894 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:26.894 "is_configured": false, 00:21:26.894 "data_offset": 2048, 00:21:26.894 "data_size": 63488 00:21:26.894 } 00:21:26.894 ] 00:21:26.894 }' 00:21:26.894 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.894 19:57:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:27.461 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:21:27.461 19:57:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:27.461 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:27.720 [2024-07-24 19:57:19.226876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:27.720 [2024-07-24 19:57:19.226931] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.720 [2024-07-24 19:57:19.226953] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1849c90 00:21:27.720 [2024-07-24 19:57:19.226966] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.720 [2024-07-24 19:57:19.227324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.720 [2024-07-24 19:57:19.227342] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:27.720 [2024-07-24 19:57:19.227428] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:27.720 [2024-07-24 19:57:19.227451] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:27.720 pt2 00:21:27.720 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:27.720 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:27.720 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:27.979 [2024-07-24 19:57:19.471524] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:27.979 [2024-07-24 19:57:19.471560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.979 [2024-07-24 19:57:19.471581] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f88c0 00:21:27.979 [2024-07-24 19:57:19.471593] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.979 [2024-07-24 19:57:19.471905] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.979 [2024-07-24 19:57:19.471922] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:27.979 [2024-07-24 19:57:19.471976] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:27.979 [2024-07-24 19:57:19.471995] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:27.979 pt3 00:21:27.979 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:27.979 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:27.979 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:28.239 [2024-07-24 19:57:19.720183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:28.239 [2024-07-24 19:57:19.720224] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.239 [2024-07-24 19:57:19.720249] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184a750 00:21:28.239 [2024-07-24 19:57:19.720262] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.239 [2024-07-24 19:57:19.720577] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.239 [2024-07-24 19:57:19.720594] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:28.239 [2024-07-24 19:57:19.720646] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:28.239 [2024-07-24 19:57:19.720666] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:28.239 [2024-07-24 19:57:19.720789] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1849530 00:21:28.239 [2024-07-24 19:57:19.720800] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:28.239 [2024-07-24 19:57:19.720970] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1851cb0 00:21:28.239 [2024-07-24 19:57:19.721106] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1849530 00:21:28.239 [2024-07-24 19:57:19.721115] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1849530 00:21:28.239 [2024-07-24 19:57:19.721209] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:28.239 pt4 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 
00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.239 19:57:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.498 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.498 "name": "raid_bdev1", 00:21:28.498 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:28.498 "strip_size_kb": 64, 00:21:28.498 "state": "online", 00:21:28.498 "raid_level": "concat", 00:21:28.498 "superblock": true, 00:21:28.498 "num_base_bdevs": 4, 00:21:28.498 "num_base_bdevs_discovered": 4, 00:21:28.498 "num_base_bdevs_operational": 4, 
00:21:28.498 "base_bdevs_list": [ 00:21:28.498 { 00:21:28.498 "name": "pt1", 00:21:28.498 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:28.498 "is_configured": true, 00:21:28.498 "data_offset": 2048, 00:21:28.498 "data_size": 63488 00:21:28.498 }, 00:21:28.498 { 00:21:28.498 "name": "pt2", 00:21:28.498 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:28.498 "is_configured": true, 00:21:28.498 "data_offset": 2048, 00:21:28.498 "data_size": 63488 00:21:28.498 }, 00:21:28.498 { 00:21:28.498 "name": "pt3", 00:21:28.498 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:28.498 "is_configured": true, 00:21:28.498 "data_offset": 2048, 00:21:28.498 "data_size": 63488 00:21:28.498 }, 00:21:28.498 { 00:21:28.498 "name": "pt4", 00:21:28.498 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:28.498 "is_configured": true, 00:21:28.498 "data_offset": 2048, 00:21:28.498 "data_size": 63488 00:21:28.498 } 00:21:28.498 ] 00:21:28.498 }' 00:21:28.498 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.498 19:57:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:29.434 19:57:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:29.434 [2024-07-24 19:57:21.015936] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:29.693 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:29.693 "name": "raid_bdev1", 00:21:29.693 "aliases": [ 00:21:29.693 "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac" 00:21:29.693 ], 00:21:29.693 "product_name": "Raid Volume", 00:21:29.693 "block_size": 512, 00:21:29.693 "num_blocks": 253952, 00:21:29.693 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:29.693 "assigned_rate_limits": { 00:21:29.693 "rw_ios_per_sec": 0, 00:21:29.693 "rw_mbytes_per_sec": 0, 00:21:29.693 "r_mbytes_per_sec": 0, 00:21:29.693 "w_mbytes_per_sec": 0 00:21:29.693 }, 00:21:29.693 "claimed": false, 00:21:29.693 "zoned": false, 00:21:29.693 "supported_io_types": { 00:21:29.693 "read": true, 00:21:29.693 "write": true, 00:21:29.693 "unmap": true, 00:21:29.693 "flush": true, 00:21:29.693 "reset": true, 00:21:29.693 "nvme_admin": false, 00:21:29.693 "nvme_io": false, 00:21:29.693 "nvme_io_md": false, 00:21:29.693 "write_zeroes": true, 00:21:29.693 "zcopy": false, 00:21:29.693 "get_zone_info": false, 00:21:29.693 "zone_management": false, 00:21:29.693 "zone_append": false, 00:21:29.693 "compare": false, 00:21:29.693 "compare_and_write": false, 00:21:29.693 "abort": false, 00:21:29.693 "seek_hole": false, 00:21:29.693 "seek_data": false, 00:21:29.693 "copy": false, 00:21:29.693 "nvme_iov_md": false 00:21:29.693 }, 00:21:29.693 "memory_domains": [ 00:21:29.693 { 00:21:29.693 "dma_device_id": "system", 00:21:29.693 "dma_device_type": 1 00:21:29.693 }, 00:21:29.693 { 00:21:29.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.693 "dma_device_type": 2 00:21:29.693 }, 00:21:29.693 { 00:21:29.693 "dma_device_id": "system", 00:21:29.693 "dma_device_type": 1 00:21:29.693 }, 00:21:29.693 { 00:21:29.693 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.693 "dma_device_type": 2 00:21:29.693 }, 00:21:29.693 { 00:21:29.693 "dma_device_id": "system", 00:21:29.693 "dma_device_type": 1 00:21:29.693 }, 00:21:29.693 { 00:21:29.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.693 "dma_device_type": 2 00:21:29.693 }, 00:21:29.693 { 00:21:29.693 "dma_device_id": "system", 00:21:29.693 "dma_device_type": 1 00:21:29.693 }, 00:21:29.693 { 00:21:29.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.693 "dma_device_type": 2 00:21:29.693 } 00:21:29.693 ], 00:21:29.693 "driver_specific": { 00:21:29.693 "raid": { 00:21:29.693 "uuid": "ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac", 00:21:29.693 "strip_size_kb": 64, 00:21:29.693 "state": "online", 00:21:29.693 "raid_level": "concat", 00:21:29.693 "superblock": true, 00:21:29.693 "num_base_bdevs": 4, 00:21:29.693 "num_base_bdevs_discovered": 4, 00:21:29.693 "num_base_bdevs_operational": 4, 00:21:29.693 "base_bdevs_list": [ 00:21:29.694 { 00:21:29.694 "name": "pt1", 00:21:29.694 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:29.694 "is_configured": true, 00:21:29.694 "data_offset": 2048, 00:21:29.694 "data_size": 63488 00:21:29.694 }, 00:21:29.694 { 00:21:29.694 "name": "pt2", 00:21:29.694 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:29.694 "is_configured": true, 00:21:29.694 "data_offset": 2048, 00:21:29.694 "data_size": 63488 00:21:29.694 }, 00:21:29.694 { 00:21:29.694 "name": "pt3", 00:21:29.694 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:29.694 "is_configured": true, 00:21:29.694 "data_offset": 2048, 00:21:29.694 "data_size": 63488 00:21:29.694 }, 00:21:29.694 { 00:21:29.694 "name": "pt4", 00:21:29.694 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:29.694 "is_configured": true, 00:21:29.694 "data_offset": 2048, 00:21:29.694 "data_size": 63488 00:21:29.694 } 00:21:29.694 ] 00:21:29.694 } 00:21:29.694 } 00:21:29.694 }' 00:21:29.694 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- 
# jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:29.694 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:29.694 pt2 00:21:29.694 pt3 00:21:29.694 pt4' 00:21:29.694 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:29.694 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:29.694 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:29.953 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:29.953 "name": "pt1", 00:21:29.953 "aliases": [ 00:21:29.953 "00000000-0000-0000-0000-000000000001" 00:21:29.953 ], 00:21:29.953 "product_name": "passthru", 00:21:29.953 "block_size": 512, 00:21:29.953 "num_blocks": 65536, 00:21:29.953 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:29.953 "assigned_rate_limits": { 00:21:29.953 "rw_ios_per_sec": 0, 00:21:29.953 "rw_mbytes_per_sec": 0, 00:21:29.953 "r_mbytes_per_sec": 0, 00:21:29.953 "w_mbytes_per_sec": 0 00:21:29.953 }, 00:21:29.953 "claimed": true, 00:21:29.953 "claim_type": "exclusive_write", 00:21:29.953 "zoned": false, 00:21:29.953 "supported_io_types": { 00:21:29.953 "read": true, 00:21:29.953 "write": true, 00:21:29.953 "unmap": true, 00:21:29.953 "flush": true, 00:21:29.953 "reset": true, 00:21:29.953 "nvme_admin": false, 00:21:29.953 "nvme_io": false, 00:21:29.953 "nvme_io_md": false, 00:21:29.953 "write_zeroes": true, 00:21:29.953 "zcopy": true, 00:21:29.953 "get_zone_info": false, 00:21:29.953 "zone_management": false, 00:21:29.953 "zone_append": false, 00:21:29.953 "compare": false, 00:21:29.953 "compare_and_write": false, 00:21:29.953 "abort": true, 00:21:29.953 "seek_hole": false, 00:21:29.953 "seek_data": false, 00:21:29.953 "copy": true, 00:21:29.953 
"nvme_iov_md": false 00:21:29.953 }, 00:21:29.953 "memory_domains": [ 00:21:29.953 { 00:21:29.953 "dma_device_id": "system", 00:21:29.953 "dma_device_type": 1 00:21:29.953 }, 00:21:29.953 { 00:21:29.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.953 "dma_device_type": 2 00:21:29.953 } 00:21:29.953 ], 00:21:29.953 "driver_specific": { 00:21:29.953 "passthru": { 00:21:29.953 "name": "pt1", 00:21:29.953 "base_bdev_name": "malloc1" 00:21:29.953 } 00:21:29.953 } 00:21:29.953 }' 00:21:29.953 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:29.953 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:29.954 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:29.954 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:29.954 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:29.954 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:29.954 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:30.213 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:30.213 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:30.213 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:30.213 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:30.213 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:30.213 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:30.213 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:30.213 
19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:30.472 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:30.472 "name": "pt2", 00:21:30.472 "aliases": [ 00:21:30.472 "00000000-0000-0000-0000-000000000002" 00:21:30.472 ], 00:21:30.472 "product_name": "passthru", 00:21:30.472 "block_size": 512, 00:21:30.472 "num_blocks": 65536, 00:21:30.472 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:30.472 "assigned_rate_limits": { 00:21:30.472 "rw_ios_per_sec": 0, 00:21:30.472 "rw_mbytes_per_sec": 0, 00:21:30.472 "r_mbytes_per_sec": 0, 00:21:30.472 "w_mbytes_per_sec": 0 00:21:30.472 }, 00:21:30.472 "claimed": true, 00:21:30.472 "claim_type": "exclusive_write", 00:21:30.472 "zoned": false, 00:21:30.472 "supported_io_types": { 00:21:30.472 "read": true, 00:21:30.472 "write": true, 00:21:30.472 "unmap": true, 00:21:30.472 "flush": true, 00:21:30.472 "reset": true, 00:21:30.472 "nvme_admin": false, 00:21:30.472 "nvme_io": false, 00:21:30.472 "nvme_io_md": false, 00:21:30.472 "write_zeroes": true, 00:21:30.472 "zcopy": true, 00:21:30.472 "get_zone_info": false, 00:21:30.472 "zone_management": false, 00:21:30.472 "zone_append": false, 00:21:30.472 "compare": false, 00:21:30.472 "compare_and_write": false, 00:21:30.472 "abort": true, 00:21:30.472 "seek_hole": false, 00:21:30.472 "seek_data": false, 00:21:30.472 "copy": true, 00:21:30.472 "nvme_iov_md": false 00:21:30.472 }, 00:21:30.472 "memory_domains": [ 00:21:30.472 { 00:21:30.472 "dma_device_id": "system", 00:21:30.472 "dma_device_type": 1 00:21:30.472 }, 00:21:30.472 { 00:21:30.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.472 "dma_device_type": 2 00:21:30.472 } 00:21:30.472 ], 00:21:30.472 "driver_specific": { 00:21:30.472 "passthru": { 00:21:30.472 "name": "pt2", 00:21:30.472 "base_bdev_name": "malloc2" 00:21:30.472 } 00:21:30.472 } 00:21:30.472 }' 00:21:30.472 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:21:30.472 19:57:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:30.472 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:30.472 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:30.731 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:30.991 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:30.991 "name": "pt3", 00:21:30.991 "aliases": [ 00:21:30.991 "00000000-0000-0000-0000-000000000003" 00:21:30.991 ], 00:21:30.991 "product_name": "passthru", 00:21:30.991 "block_size": 512, 00:21:30.991 "num_blocks": 65536, 00:21:30.991 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:30.991 "assigned_rate_limits": { 00:21:30.991 "rw_ios_per_sec": 0, 00:21:30.991 "rw_mbytes_per_sec": 0, 
00:21:30.991 "r_mbytes_per_sec": 0, 00:21:30.991 "w_mbytes_per_sec": 0 00:21:30.991 }, 00:21:30.991 "claimed": true, 00:21:30.991 "claim_type": "exclusive_write", 00:21:30.991 "zoned": false, 00:21:30.991 "supported_io_types": { 00:21:30.991 "read": true, 00:21:30.991 "write": true, 00:21:30.991 "unmap": true, 00:21:30.991 "flush": true, 00:21:30.991 "reset": true, 00:21:30.991 "nvme_admin": false, 00:21:30.991 "nvme_io": false, 00:21:30.991 "nvme_io_md": false, 00:21:30.991 "write_zeroes": true, 00:21:30.991 "zcopy": true, 00:21:30.991 "get_zone_info": false, 00:21:30.991 "zone_management": false, 00:21:30.991 "zone_append": false, 00:21:30.991 "compare": false, 00:21:30.991 "compare_and_write": false, 00:21:30.991 "abort": true, 00:21:30.991 "seek_hole": false, 00:21:30.991 "seek_data": false, 00:21:30.991 "copy": true, 00:21:30.991 "nvme_iov_md": false 00:21:30.991 }, 00:21:30.991 "memory_domains": [ 00:21:30.991 { 00:21:30.991 "dma_device_id": "system", 00:21:30.991 "dma_device_type": 1 00:21:30.991 }, 00:21:30.991 { 00:21:30.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.991 "dma_device_type": 2 00:21:30.991 } 00:21:30.991 ], 00:21:30.991 "driver_specific": { 00:21:30.991 "passthru": { 00:21:30.991 "name": "pt3", 00:21:30.991 "base_bdev_name": "malloc3" 00:21:30.991 } 00:21:30.991 } 00:21:30.991 }' 00:21:30.991 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:30.991 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:30.991 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:30.991 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:31.250 19:57:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:31.515 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:31.515 "name": "pt4", 00:21:31.515 "aliases": [ 00:21:31.515 "00000000-0000-0000-0000-000000000004" 00:21:31.515 ], 00:21:31.515 "product_name": "passthru", 00:21:31.515 "block_size": 512, 00:21:31.515 "num_blocks": 65536, 00:21:31.515 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:31.515 "assigned_rate_limits": { 00:21:31.515 "rw_ios_per_sec": 0, 00:21:31.515 "rw_mbytes_per_sec": 0, 00:21:31.515 "r_mbytes_per_sec": 0, 00:21:31.515 "w_mbytes_per_sec": 0 00:21:31.515 }, 00:21:31.515 "claimed": true, 00:21:31.515 "claim_type": "exclusive_write", 00:21:31.515 "zoned": false, 00:21:31.515 "supported_io_types": { 00:21:31.515 "read": true, 00:21:31.515 "write": true, 00:21:31.515 "unmap": true, 00:21:31.515 "flush": true, 00:21:31.515 "reset": true, 00:21:31.515 "nvme_admin": false, 00:21:31.515 "nvme_io": false, 00:21:31.515 "nvme_io_md": false, 00:21:31.515 "write_zeroes": true, 00:21:31.515 "zcopy": true, 00:21:31.515 "get_zone_info": false, 00:21:31.515 
"zone_management": false, 00:21:31.515 "zone_append": false, 00:21:31.515 "compare": false, 00:21:31.515 "compare_and_write": false, 00:21:31.515 "abort": true, 00:21:31.515 "seek_hole": false, 00:21:31.515 "seek_data": false, 00:21:31.515 "copy": true, 00:21:31.515 "nvme_iov_md": false 00:21:31.515 }, 00:21:31.515 "memory_domains": [ 00:21:31.515 { 00:21:31.515 "dma_device_id": "system", 00:21:31.515 "dma_device_type": 1 00:21:31.515 }, 00:21:31.515 { 00:21:31.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.515 "dma_device_type": 2 00:21:31.515 } 00:21:31.515 ], 00:21:31.515 "driver_specific": { 00:21:31.515 "passthru": { 00:21:31.515 "name": "pt4", 00:21:31.515 "base_bdev_name": "malloc4" 00:21:31.515 } 00:21:31.515 } 00:21:31.515 }' 00:21:31.515 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:31.773 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.038 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.038 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:32.038 19:57:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:21:32.038 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:32.297 [2024-07-24 19:57:23.638895] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac '!=' ec24541d-ce4b-4a24-8cd2-51e8a2dec6ac ']' 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1462063 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1462063 ']' 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1462063 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1462063 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1462063' 00:21:32.297 killing process with pid 1462063 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1462063 
00:21:32.297 [2024-07-24 19:57:23.708251] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:32.297 [2024-07-24 19:57:23.708318] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:32.297 [2024-07-24 19:57:23.708387] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:32.297 [2024-07-24 19:57:23.708408] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1849530 name raid_bdev1, state offline 00:21:32.297 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1462063 00:21:32.297 [2024-07-24 19:57:23.745190] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:32.557 19:57:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:21:32.557 00:21:32.557 real 0m18.021s 00:21:32.557 user 0m32.637s 00:21:32.557 sys 0m3.148s 00:21:32.557 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:32.557 19:57:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.557 ************************************ 00:21:32.557 END TEST raid_superblock_test 00:21:32.557 ************************************ 00:21:32.557 19:57:24 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:21:32.557 19:57:24 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:32.557 19:57:24 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:32.557 19:57:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:32.557 ************************************ 00:21:32.557 START TEST raid_read_error_test 00:21:32.557 ************************************ 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 read 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local 
raid_level=concat 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 
00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.tXH5ULVLnG 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1464819 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1464819 /var/tmp/spdk-raid.sock 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1464819 ']' 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:21:32.557 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:32.557 19:57:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.817 [2024-07-24 19:57:24.164914] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:21:32.817 [2024-07-24 19:57:24.165055] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1464819 ] 00:21:32.817 [2024-07-24 19:57:24.364146] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:33.076 [2024-07-24 19:57:24.469881] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:33.076 [2024-07-24 19:57:24.542588] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:33.076 [2024-07-24 19:57:24.542624] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:34.011 19:57:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:34.011 19:57:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:21:34.011 19:57:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:34.011 19:57:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:34.271 BaseBdev1_malloc 00:21:34.531 19:57:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:34.531 true 00:21:34.790 19:57:26 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:35.050 [2024-07-24 19:57:26.619813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:35.050 [2024-07-24 19:57:26.619861] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:35.050 [2024-07-24 19:57:26.619884] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19c73a0 00:21:35.050 [2024-07-24 19:57:26.619896] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:35.050 [2024-07-24 19:57:26.621665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:35.050 [2024-07-24 19:57:26.621693] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:35.050 BaseBdev1 00:21:35.391 19:57:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:35.391 19:57:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:35.391 BaseBdev2_malloc 00:21:35.391 19:57:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:35.650 true 00:21:35.650 19:57:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:36.219 [2024-07-24 19:57:27.628246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:36.219 [2024-07-24 19:57:27.628290] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:36.219 [2024-07-24 19:57:27.628314] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a86370 00:21:36.219 [2024-07-24 19:57:27.628332] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:36.219 [2024-07-24 19:57:27.629895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:36.219 [2024-07-24 19:57:27.629923] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:36.219 BaseBdev2 00:21:36.219 19:57:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:36.219 19:57:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:36.479 BaseBdev3_malloc 00:21:36.479 19:57:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:37.049 true 00:21:37.049 19:57:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:37.308 [2024-07-24 19:57:28.652098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:37.308 [2024-07-24 19:57:28.652144] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:37.308 [2024-07-24 19:57:28.652169] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bc2d0 00:21:37.308 [2024-07-24 19:57:28.652181] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:37.308 [2024-07-24 19:57:28.653772] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:21:37.308 [2024-07-24 19:57:28.653800] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:37.309 BaseBdev3 00:21:37.309 19:57:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:37.309 19:57:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:37.568 BaseBdev4_malloc 00:21:37.827 19:57:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:38.086 true 00:21:38.346 19:57:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:38.346 [2024-07-24 19:57:29.924014] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:38.346 [2024-07-24 19:57:29.924059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.346 [2024-07-24 19:57:29.924086] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bf310 00:21:38.346 [2024-07-24 19:57:29.924099] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.346 [2024-07-24 19:57:29.925703] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.346 [2024-07-24 19:57:29.925730] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:38.346 BaseBdev4 00:21:38.605 19:57:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:21:38.865 [2024-07-24 19:57:30.425363] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:38.865 [2024-07-24 19:57:30.426752] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:38.865 [2024-07-24 19:57:30.426820] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:38.865 [2024-07-24 19:57:30.426879] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:38.865 [2024-07-24 19:57:30.427118] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19c0060 00:21:38.865 [2024-07-24 19:57:30.427129] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:38.865 [2024-07-24 19:57:30.427337] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19c0c10 00:21:38.865 [2024-07-24 19:57:30.427502] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19c0060 00:21:38.865 [2024-07-24 19:57:30.427513] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19c0060 00:21:38.865 [2024-07-24 19:57:30.427620] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:38.865 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:38.865 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.125 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.125 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:39.125 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:39.125 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.126 "name": "raid_bdev1", 00:21:39.126 "uuid": "32ed5513-f6c4-4208-9841-69fe6ace1fe8", 00:21:39.126 "strip_size_kb": 64, 00:21:39.126 "state": "online", 00:21:39.126 "raid_level": "concat", 00:21:39.126 "superblock": true, 00:21:39.126 "num_base_bdevs": 4, 00:21:39.126 "num_base_bdevs_discovered": 4, 00:21:39.126 "num_base_bdevs_operational": 4, 00:21:39.126 "base_bdevs_list": [ 00:21:39.126 { 00:21:39.126 "name": "BaseBdev1", 00:21:39.126 "uuid": "513be127-2cc0-5d20-bce1-78b67d592d89", 00:21:39.126 "is_configured": true, 00:21:39.126 "data_offset": 2048, 00:21:39.126 "data_size": 63488 00:21:39.126 }, 00:21:39.126 { 00:21:39.126 "name": "BaseBdev2", 00:21:39.126 "uuid": "aad68f73-d270-5a43-81f3-01fe27cfd2c4", 00:21:39.126 "is_configured": true, 00:21:39.126 "data_offset": 2048, 00:21:39.126 "data_size": 63488 00:21:39.126 }, 00:21:39.126 { 00:21:39.126 "name": "BaseBdev3", 00:21:39.126 "uuid": "46625b6f-6351-5725-9f83-0ee819fae69d", 00:21:39.126 "is_configured": true, 00:21:39.126 "data_offset": 2048, 00:21:39.126 "data_size": 63488 00:21:39.126 }, 00:21:39.126 { 00:21:39.126 "name": "BaseBdev4", 00:21:39.126 "uuid": "e4b2d409-a67a-5485-83d5-08c38f548f29", 00:21:39.126 
"is_configured": true, 00:21:39.126 "data_offset": 2048, 00:21:39.126 "data_size": 63488 00:21:39.126 } 00:21:39.126 ] 00:21:39.126 }' 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.126 19:57:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.066 19:57:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:21:40.066 19:57:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:40.066 [2024-07-24 19:57:31.424281] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a87e20 00:21:41.005 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:41.005 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:21:41.005 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.006 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.265 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.265 "name": "raid_bdev1", 00:21:41.265 "uuid": "32ed5513-f6c4-4208-9841-69fe6ace1fe8", 00:21:41.265 "strip_size_kb": 64, 00:21:41.265 "state": "online", 00:21:41.265 "raid_level": "concat", 00:21:41.265 "superblock": true, 00:21:41.265 "num_base_bdevs": 4, 00:21:41.265 "num_base_bdevs_discovered": 4, 00:21:41.265 "num_base_bdevs_operational": 4, 00:21:41.265 "base_bdevs_list": [ 00:21:41.265 { 00:21:41.265 "name": "BaseBdev1", 00:21:41.265 "uuid": "513be127-2cc0-5d20-bce1-78b67d592d89", 00:21:41.265 "is_configured": true, 00:21:41.265 "data_offset": 2048, 00:21:41.265 "data_size": 63488 00:21:41.265 }, 00:21:41.265 { 00:21:41.265 "name": "BaseBdev2", 00:21:41.265 "uuid": "aad68f73-d270-5a43-81f3-01fe27cfd2c4", 00:21:41.265 "is_configured": true, 00:21:41.265 "data_offset": 2048, 00:21:41.265 "data_size": 63488 00:21:41.265 }, 00:21:41.265 { 00:21:41.265 "name": "BaseBdev3", 00:21:41.265 "uuid": "46625b6f-6351-5725-9f83-0ee819fae69d", 00:21:41.265 "is_configured": true, 00:21:41.265 "data_offset": 2048, 00:21:41.265 "data_size": 63488 00:21:41.265 }, 00:21:41.265 { 00:21:41.265 "name": "BaseBdev4", 00:21:41.265 "uuid": 
"e4b2d409-a67a-5485-83d5-08c38f548f29", 00:21:41.265 "is_configured": true, 00:21:41.265 "data_offset": 2048, 00:21:41.265 "data_size": 63488 00:21:41.265 } 00:21:41.265 ] 00:21:41.265 }' 00:21:41.265 19:57:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.265 19:57:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:41.833 19:57:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:42.412 [2024-07-24 19:57:33.900989] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:42.412 [2024-07-24 19:57:33.901027] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:42.412 [2024-07-24 19:57:33.904247] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:42.412 [2024-07-24 19:57:33.904287] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.412 [2024-07-24 19:57:33.904325] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:42.412 [2024-07-24 19:57:33.904336] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19c0060 name raid_bdev1, state offline 00:21:42.412 0 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1464819 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1464819 ']' 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1464819 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # 
ps --no-headers -o comm= 1464819 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1464819' 00:21:42.412 killing process with pid 1464819 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1464819 00:21:42.412 [2024-07-24 19:57:33.984634] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:42.412 19:57:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1464819 00:21:42.672 [2024-07-24 19:57:34.020733] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.tXH5ULVLnG 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.41 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.41 != \0\.\0\0 ]] 00:21:42.672 00:21:42.672 real 0m10.225s 00:21:42.672 user 0m17.020s 00:21:42.672 sys 0m1.693s 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:42.672 19:57:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.672 
************************************ 00:21:42.672 END TEST raid_read_error_test 00:21:42.672 ************************************ 00:21:42.932 19:57:34 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:21:42.932 19:57:34 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:42.932 19:57:34 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:42.932 19:57:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:42.932 ************************************ 00:21:42.932 START TEST raid_write_error_test 00:21:42.932 ************************************ 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 write 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # 
echo BaseBdev3 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.1rZE1WBpEQ 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1466229 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@825 -- # waitforlisten 1466229 /var/tmp/spdk-raid.sock 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1466229 ']' 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:42.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:42.932 19:57:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.932 [2024-07-24 19:57:34.435164] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:21:42.932 [2024-07-24 19:57:34.435238] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1466229 ] 00:21:43.191 [2024-07-24 19:57:34.565431] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:43.191 [2024-07-24 19:57:34.667551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:43.191 [2024-07-24 19:57:34.737687] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.191 [2024-07-24 19:57:34.737724] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:44.130 19:57:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:44.130 19:57:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:21:44.130 19:57:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:44.130 19:57:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:44.130 BaseBdev1_malloc 00:21:44.130 19:57:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:44.389 true 00:21:44.389 19:57:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:44.649 [2024-07-24 19:57:36.087782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:44.649 [2024-07-24 19:57:36.087827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:21:44.649 [2024-07-24 19:57:36.087846] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe643a0 00:21:44.649 [2024-07-24 19:57:36.087859] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:44.649 [2024-07-24 19:57:36.089426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:44.649 [2024-07-24 19:57:36.089458] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:44.649 BaseBdev1 00:21:44.649 19:57:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:44.649 19:57:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:44.908 BaseBdev2_malloc 00:21:44.908 19:57:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:45.167 true 00:21:45.167 19:57:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:45.426 [2024-07-24 19:57:36.906568] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:45.426 [2024-07-24 19:57:36.906611] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:45.426 [2024-07-24 19:57:36.906634] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf23370 00:21:45.426 [2024-07-24 19:57:36.906646] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:45.426 [2024-07-24 19:57:36.908121] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:45.426 [2024-07-24 19:57:36.908148] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:45.426 BaseBdev2 00:21:45.426 19:57:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:45.426 19:57:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:45.685 BaseBdev3_malloc 00:21:45.685 19:57:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:45.944 true 00:21:45.944 19:57:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:46.204 [2024-07-24 19:57:37.649016] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:46.204 [2024-07-24 19:57:37.649059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:46.204 [2024-07-24 19:57:37.649082] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe592d0 00:21:46.204 [2024-07-24 19:57:37.649094] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:46.204 [2024-07-24 19:57:37.650573] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:46.204 [2024-07-24 19:57:37.650601] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:46.204 BaseBdev3 00:21:46.204 19:57:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:46.204 19:57:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:46.463 BaseBdev4_malloc 00:21:46.463 19:57:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:46.722 true 00:21:46.722 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:46.981 [2024-07-24 19:57:38.399635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:46.981 [2024-07-24 19:57:38.399675] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:46.981 [2024-07-24 19:57:38.399696] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe5c310 00:21:46.981 [2024-07-24 19:57:38.399714] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:46.981 [2024-07-24 19:57:38.401096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:46.981 [2024-07-24 19:57:38.401122] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:46.981 BaseBdev4 00:21:46.981 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:47.240 [2024-07-24 19:57:38.644309] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:47.240 [2024-07-24 19:57:38.645476] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:47.240 [2024-07-24 19:57:38.645541] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:47.240 [2024-07-24 19:57:38.645599] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:47.240 [2024-07-24 19:57:38.645825] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe5d060 00:21:47.240 [2024-07-24 19:57:38.645837] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:47.240 [2024-07-24 19:57:38.646009] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe5dc10 00:21:47.240 [2024-07-24 19:57:38.646154] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe5d060 00:21:47.240 [2024-07-24 19:57:38.646164] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe5d060 00:21:47.241 [2024-07-24 19:57:38.646259] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.241 19:57:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.241 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.500 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.500 "name": "raid_bdev1", 00:21:47.500 "uuid": "76d4336d-ed49-4dbc-843c-d673cabae9df", 00:21:47.500 "strip_size_kb": 64, 00:21:47.500 "state": "online", 00:21:47.500 "raid_level": "concat", 00:21:47.500 "superblock": true, 00:21:47.500 "num_base_bdevs": 4, 00:21:47.500 "num_base_bdevs_discovered": 4, 00:21:47.500 "num_base_bdevs_operational": 4, 00:21:47.500 "base_bdevs_list": [ 00:21:47.500 { 00:21:47.500 "name": "BaseBdev1", 00:21:47.500 "uuid": "cb40321e-10fd-56af-9e21-2b5d0875f2e4", 00:21:47.500 "is_configured": true, 00:21:47.500 "data_offset": 2048, 00:21:47.500 "data_size": 63488 00:21:47.500 }, 00:21:47.500 { 00:21:47.500 "name": "BaseBdev2", 00:21:47.500 "uuid": "c4407f67-68c4-535e-bbb2-1f34b29e1bde", 00:21:47.500 "is_configured": true, 00:21:47.500 "data_offset": 2048, 00:21:47.500 "data_size": 63488 00:21:47.500 }, 00:21:47.500 { 00:21:47.500 "name": "BaseBdev3", 00:21:47.500 "uuid": "7b000286-b83b-575b-a687-d64629e419a0", 00:21:47.500 "is_configured": true, 00:21:47.500 "data_offset": 2048, 00:21:47.500 "data_size": 63488 00:21:47.500 }, 00:21:47.500 { 00:21:47.500 "name": "BaseBdev4", 00:21:47.500 "uuid": "62a40092-bf08-5e1b-b712-74df0d5e7472", 00:21:47.500 "is_configured": true, 00:21:47.500 "data_offset": 2048, 00:21:47.500 "data_size": 63488 00:21:47.500 } 00:21:47.500 ] 00:21:47.500 }' 00:21:47.500 19:57:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.501 19:57:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.070 19:57:39 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:48.070 19:57:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:21:48.070 [2024-07-24 19:57:39.615179] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf24e20 00:21:49.009 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.268 19:57:40 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.268 19:57:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.527 19:57:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.527 "name": "raid_bdev1", 00:21:49.527 "uuid": "76d4336d-ed49-4dbc-843c-d673cabae9df", 00:21:49.527 "strip_size_kb": 64, 00:21:49.527 "state": "online", 00:21:49.527 "raid_level": "concat", 00:21:49.527 "superblock": true, 00:21:49.527 "num_base_bdevs": 4, 00:21:49.527 "num_base_bdevs_discovered": 4, 00:21:49.527 "num_base_bdevs_operational": 4, 00:21:49.527 "base_bdevs_list": [ 00:21:49.527 { 00:21:49.527 "name": "BaseBdev1", 00:21:49.527 "uuid": "cb40321e-10fd-56af-9e21-2b5d0875f2e4", 00:21:49.527 "is_configured": true, 00:21:49.527 "data_offset": 2048, 00:21:49.527 "data_size": 63488 00:21:49.527 }, 00:21:49.527 { 00:21:49.527 "name": "BaseBdev2", 00:21:49.527 "uuid": "c4407f67-68c4-535e-bbb2-1f34b29e1bde", 00:21:49.527 "is_configured": true, 00:21:49.527 "data_offset": 2048, 00:21:49.527 "data_size": 63488 00:21:49.527 }, 00:21:49.527 { 00:21:49.527 "name": "BaseBdev3", 00:21:49.527 "uuid": "7b000286-b83b-575b-a687-d64629e419a0", 00:21:49.527 "is_configured": true, 00:21:49.527 "data_offset": 2048, 00:21:49.527 "data_size": 63488 00:21:49.527 }, 00:21:49.527 { 00:21:49.527 "name": "BaseBdev4", 00:21:49.527 "uuid": "62a40092-bf08-5e1b-b712-74df0d5e7472", 00:21:49.527 "is_configured": true, 00:21:49.527 "data_offset": 2048, 00:21:49.527 "data_size": 63488 00:21:49.527 } 00:21:49.527 ] 00:21:49.527 }' 00:21:49.527 19:57:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.527 19:57:41 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:50.096 19:57:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:50.356 [2024-07-24 19:57:41.869048] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:50.356 [2024-07-24 19:57:41.869095] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:50.356 [2024-07-24 19:57:41.872285] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:50.356 [2024-07-24 19:57:41.872335] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:50.356 [2024-07-24 19:57:41.872376] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:50.356 [2024-07-24 19:57:41.872387] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe5d060 name raid_bdev1, state offline 00:21:50.356 0 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1466229 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1466229 ']' 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1466229 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1466229 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 1466229' 00:21:50.356 killing process with pid 1466229 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1466229 00:21:50.356 [2024-07-24 19:57:41.938770] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:50.356 19:57:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1466229 00:21:50.615 [2024-07-24 19:57:41.974340] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.1rZE1WBpEQ 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.45 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.45 != \0\.\0\0 ]] 00:21:50.875 00:21:50.875 real 0m7.868s 00:21:50.875 user 0m12.649s 00:21:50.875 sys 0m1.346s 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:50.875 19:57:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.875 ************************************ 00:21:50.875 END TEST raid_write_error_test 00:21:50.875 ************************************ 00:21:50.875 19:57:42 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:21:50.875 19:57:42 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test 
raid_state_function_test raid1 4 false 00:21:50.875 19:57:42 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:50.875 19:57:42 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:50.875 19:57:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:50.875 ************************************ 00:21:50.875 START TEST raid_state_function_test 00:21:50.875 ************************************ 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 false 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:50.875 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1467378 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1467378' 00:21:50.876 Process raid pid: 1467378 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1467378 /var/tmp/spdk-raid.sock 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 1467378 ']' 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:50.876 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:50.876 19:57:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.876 [2024-07-24 19:57:42.379844] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:21:50.876 [2024-07-24 19:57:42.379910] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:51.135 [2024-07-24 19:57:42.508378] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:51.135 [2024-07-24 19:57:42.614176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:51.135 [2024-07-24 19:57:42.687501] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:51.135 [2024-07-24 19:57:42.687539] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:52.072 19:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:52.072 19:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:21:52.072 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:52.072 [2024-07-24 19:57:43.534613] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:52.072 [2024-07-24 19:57:43.534651] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:52.072 [2024-07-24 19:57:43.534662] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:52.072 [2024-07-24 19:57:43.534674] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:52.072 [2024-07-24 19:57:43.534683] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:52.072 [2024-07-24 19:57:43.534693] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:52.072 
[2024-07-24 19:57:43.534702] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:52.072 [2024-07-24 19:57:43.534712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:52.072 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.073 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.332 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.332 "name": "Existed_Raid", 00:21:52.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.332 "strip_size_kb": 0, 00:21:52.332 "state": 
"configuring", 00:21:52.332 "raid_level": "raid1", 00:21:52.332 "superblock": false, 00:21:52.332 "num_base_bdevs": 4, 00:21:52.332 "num_base_bdevs_discovered": 0, 00:21:52.332 "num_base_bdevs_operational": 4, 00:21:52.332 "base_bdevs_list": [ 00:21:52.332 { 00:21:52.332 "name": "BaseBdev1", 00:21:52.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.332 "is_configured": false, 00:21:52.332 "data_offset": 0, 00:21:52.332 "data_size": 0 00:21:52.332 }, 00:21:52.332 { 00:21:52.332 "name": "BaseBdev2", 00:21:52.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.332 "is_configured": false, 00:21:52.332 "data_offset": 0, 00:21:52.332 "data_size": 0 00:21:52.332 }, 00:21:52.332 { 00:21:52.332 "name": "BaseBdev3", 00:21:52.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.332 "is_configured": false, 00:21:52.332 "data_offset": 0, 00:21:52.332 "data_size": 0 00:21:52.332 }, 00:21:52.332 { 00:21:52.332 "name": "BaseBdev4", 00:21:52.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.332 "is_configured": false, 00:21:52.332 "data_offset": 0, 00:21:52.332 "data_size": 0 00:21:52.332 } 00:21:52.332 ] 00:21:52.332 }' 00:21:52.332 19:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.332 19:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.898 19:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:53.157 [2024-07-24 19:57:44.641430] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:53.157 [2024-07-24 19:57:44.641458] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xff1a30 name Existed_Raid, state configuring 00:21:53.157 19:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:53.416 [2024-07-24 19:57:44.890090] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:53.416 [2024-07-24 19:57:44.890118] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:53.416 [2024-07-24 19:57:44.890128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:53.416 [2024-07-24 19:57:44.890139] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:53.416 [2024-07-24 19:57:44.890148] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:53.416 [2024-07-24 19:57:44.890159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:53.416 [2024-07-24 19:57:44.890167] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:53.416 [2024-07-24 19:57:44.890178] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:53.416 19:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:53.676 [2024-07-24 19:57:45.140672] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:53.676 BaseBdev1 00:21:53.676 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:53.676 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:21:53.676 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:53.676 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:53.676 19:57:45 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:53.676 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:53.676 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:53.935 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:54.194 [ 00:21:54.194 { 00:21:54.194 "name": "BaseBdev1", 00:21:54.194 "aliases": [ 00:21:54.194 "046f0ec2-ae2a-4d0d-af68-eac98511b4b1" 00:21:54.194 ], 00:21:54.194 "product_name": "Malloc disk", 00:21:54.194 "block_size": 512, 00:21:54.194 "num_blocks": 65536, 00:21:54.194 "uuid": "046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:21:54.194 "assigned_rate_limits": { 00:21:54.194 "rw_ios_per_sec": 0, 00:21:54.194 "rw_mbytes_per_sec": 0, 00:21:54.194 "r_mbytes_per_sec": 0, 00:21:54.194 "w_mbytes_per_sec": 0 00:21:54.194 }, 00:21:54.194 "claimed": true, 00:21:54.194 "claim_type": "exclusive_write", 00:21:54.194 "zoned": false, 00:21:54.194 "supported_io_types": { 00:21:54.194 "read": true, 00:21:54.194 "write": true, 00:21:54.194 "unmap": true, 00:21:54.194 "flush": true, 00:21:54.194 "reset": true, 00:21:54.194 "nvme_admin": false, 00:21:54.194 "nvme_io": false, 00:21:54.194 "nvme_io_md": false, 00:21:54.194 "write_zeroes": true, 00:21:54.194 "zcopy": true, 00:21:54.194 "get_zone_info": false, 00:21:54.194 "zone_management": false, 00:21:54.194 "zone_append": false, 00:21:54.194 "compare": false, 00:21:54.194 "compare_and_write": false, 00:21:54.194 "abort": true, 00:21:54.194 "seek_hole": false, 00:21:54.194 "seek_data": false, 00:21:54.194 "copy": true, 00:21:54.194 "nvme_iov_md": false 00:21:54.194 }, 00:21:54.194 "memory_domains": [ 00:21:54.194 { 
00:21:54.194 "dma_device_id": "system", 00:21:54.194 "dma_device_type": 1 00:21:54.194 }, 00:21:54.194 { 00:21:54.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.194 "dma_device_type": 2 00:21:54.194 } 00:21:54.194 ], 00:21:54.194 "driver_specific": {} 00:21:54.194 } 00:21:54.194 ] 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.194 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.195 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:54.453 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:21:54.453 "name": "Existed_Raid", 00:21:54.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.453 "strip_size_kb": 0, 00:21:54.453 "state": "configuring", 00:21:54.453 "raid_level": "raid1", 00:21:54.453 "superblock": false, 00:21:54.453 "num_base_bdevs": 4, 00:21:54.453 "num_base_bdevs_discovered": 1, 00:21:54.453 "num_base_bdevs_operational": 4, 00:21:54.453 "base_bdevs_list": [ 00:21:54.453 { 00:21:54.453 "name": "BaseBdev1", 00:21:54.453 "uuid": "046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:21:54.453 "is_configured": true, 00:21:54.453 "data_offset": 0, 00:21:54.453 "data_size": 65536 00:21:54.453 }, 00:21:54.453 { 00:21:54.453 "name": "BaseBdev2", 00:21:54.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.453 "is_configured": false, 00:21:54.453 "data_offset": 0, 00:21:54.453 "data_size": 0 00:21:54.453 }, 00:21:54.453 { 00:21:54.453 "name": "BaseBdev3", 00:21:54.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.453 "is_configured": false, 00:21:54.453 "data_offset": 0, 00:21:54.453 "data_size": 0 00:21:54.453 }, 00:21:54.453 { 00:21:54.453 "name": "BaseBdev4", 00:21:54.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.453 "is_configured": false, 00:21:54.453 "data_offset": 0, 00:21:54.453 "data_size": 0 00:21:54.453 } 00:21:54.453 ] 00:21:54.453 }' 00:21:54.453 19:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.453 19:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:55.020 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:55.315 [2024-07-24 19:57:46.720858] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:55.315 [2024-07-24 19:57:46.720896] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xff12a0 name Existed_Raid, state configuring 
00:21:55.315 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:55.647 [2024-07-24 19:57:46.969538] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:55.647 [2024-07-24 19:57:46.970989] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:55.647 [2024-07-24 19:57:46.971019] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:55.647 [2024-07-24 19:57:46.971029] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:55.647 [2024-07-24 19:57:46.971041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:55.647 [2024-07-24 19:57:46.971050] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:55.647 [2024-07-24 19:57:46.971065] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.647 19:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:55.907 19:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.907 "name": "Existed_Raid", 00:21:55.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.907 "strip_size_kb": 0, 00:21:55.907 "state": "configuring", 00:21:55.907 "raid_level": "raid1", 00:21:55.907 "superblock": false, 00:21:55.907 "num_base_bdevs": 4, 00:21:55.907 "num_base_bdevs_discovered": 1, 00:21:55.907 "num_base_bdevs_operational": 4, 00:21:55.907 "base_bdevs_list": [ 00:21:55.907 { 00:21:55.907 "name": "BaseBdev1", 00:21:55.907 "uuid": "046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:21:55.907 "is_configured": true, 00:21:55.907 "data_offset": 0, 00:21:55.907 "data_size": 65536 00:21:55.907 }, 00:21:55.907 { 00:21:55.907 "name": "BaseBdev2", 00:21:55.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.907 "is_configured": false, 00:21:55.907 "data_offset": 0, 00:21:55.907 "data_size": 0 00:21:55.907 }, 00:21:55.907 { 00:21:55.907 "name": "BaseBdev3", 00:21:55.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.907 "is_configured": false, 00:21:55.907 
"data_offset": 0, 00:21:55.907 "data_size": 0 00:21:55.907 }, 00:21:55.907 { 00:21:55.907 "name": "BaseBdev4", 00:21:55.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.907 "is_configured": false, 00:21:55.907 "data_offset": 0, 00:21:55.907 "data_size": 0 00:21:55.907 } 00:21:55.907 ] 00:21:55.907 }' 00:21:55.907 19:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.907 19:57:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:56.845 [2024-07-24 19:57:48.368628] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:56.845 BaseBdev2 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:56.845 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:57.104 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:57.364 [ 
00:21:57.364 { 00:21:57.364 "name": "BaseBdev2", 00:21:57.364 "aliases": [ 00:21:57.364 "49ab113a-0c0f-4f48-a761-811f83cfe91d" 00:21:57.364 ], 00:21:57.364 "product_name": "Malloc disk", 00:21:57.364 "block_size": 512, 00:21:57.364 "num_blocks": 65536, 00:21:57.364 "uuid": "49ab113a-0c0f-4f48-a761-811f83cfe91d", 00:21:57.364 "assigned_rate_limits": { 00:21:57.364 "rw_ios_per_sec": 0, 00:21:57.364 "rw_mbytes_per_sec": 0, 00:21:57.364 "r_mbytes_per_sec": 0, 00:21:57.364 "w_mbytes_per_sec": 0 00:21:57.364 }, 00:21:57.364 "claimed": true, 00:21:57.364 "claim_type": "exclusive_write", 00:21:57.364 "zoned": false, 00:21:57.364 "supported_io_types": { 00:21:57.364 "read": true, 00:21:57.364 "write": true, 00:21:57.364 "unmap": true, 00:21:57.364 "flush": true, 00:21:57.364 "reset": true, 00:21:57.364 "nvme_admin": false, 00:21:57.364 "nvme_io": false, 00:21:57.364 "nvme_io_md": false, 00:21:57.364 "write_zeroes": true, 00:21:57.364 "zcopy": true, 00:21:57.364 "get_zone_info": false, 00:21:57.364 "zone_management": false, 00:21:57.364 "zone_append": false, 00:21:57.364 "compare": false, 00:21:57.364 "compare_and_write": false, 00:21:57.364 "abort": true, 00:21:57.364 "seek_hole": false, 00:21:57.364 "seek_data": false, 00:21:57.364 "copy": true, 00:21:57.364 "nvme_iov_md": false 00:21:57.364 }, 00:21:57.364 "memory_domains": [ 00:21:57.364 { 00:21:57.364 "dma_device_id": "system", 00:21:57.364 "dma_device_type": 1 00:21:57.364 }, 00:21:57.364 { 00:21:57.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.364 "dma_device_type": 2 00:21:57.364 } 00:21:57.364 ], 00:21:57.364 "driver_specific": {} 00:21:57.364 } 00:21:57.364 ] 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:57.364 19:57:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.364 19:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:57.623 19:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.623 "name": "Existed_Raid", 00:21:57.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.623 "strip_size_kb": 0, 00:21:57.623 "state": "configuring", 00:21:57.623 "raid_level": "raid1", 00:21:57.623 "superblock": false, 00:21:57.623 "num_base_bdevs": 4, 00:21:57.623 "num_base_bdevs_discovered": 2, 00:21:57.623 "num_base_bdevs_operational": 4, 00:21:57.623 "base_bdevs_list": [ 00:21:57.623 { 00:21:57.623 
"name": "BaseBdev1", 00:21:57.623 "uuid": "046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:21:57.623 "is_configured": true, 00:21:57.623 "data_offset": 0, 00:21:57.623 "data_size": 65536 00:21:57.623 }, 00:21:57.623 { 00:21:57.623 "name": "BaseBdev2", 00:21:57.623 "uuid": "49ab113a-0c0f-4f48-a761-811f83cfe91d", 00:21:57.623 "is_configured": true, 00:21:57.623 "data_offset": 0, 00:21:57.623 "data_size": 65536 00:21:57.623 }, 00:21:57.623 { 00:21:57.623 "name": "BaseBdev3", 00:21:57.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.623 "is_configured": false, 00:21:57.623 "data_offset": 0, 00:21:57.623 "data_size": 0 00:21:57.623 }, 00:21:57.623 { 00:21:57.623 "name": "BaseBdev4", 00:21:57.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.623 "is_configured": false, 00:21:57.623 "data_offset": 0, 00:21:57.623 "data_size": 0 00:21:57.623 } 00:21:57.623 ] 00:21:57.623 }' 00:21:57.623 19:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.623 19:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:58.192 19:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:58.452 [2024-07-24 19:57:49.892021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:58.452 BaseBdev3 00:21:58.452 19:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:58.452 19:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:21:58.452 19:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:58.452 19:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:21:58.452 19:57:49 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:58.452 19:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:58.452 19:57:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:58.711 19:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:58.971 [ 00:21:58.971 { 00:21:58.971 "name": "BaseBdev3", 00:21:58.971 "aliases": [ 00:21:58.971 "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7" 00:21:58.971 ], 00:21:58.971 "product_name": "Malloc disk", 00:21:58.971 "block_size": 512, 00:21:58.971 "num_blocks": 65536, 00:21:58.971 "uuid": "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7", 00:21:58.971 "assigned_rate_limits": { 00:21:58.971 "rw_ios_per_sec": 0, 00:21:58.971 "rw_mbytes_per_sec": 0, 00:21:58.971 "r_mbytes_per_sec": 0, 00:21:58.971 "w_mbytes_per_sec": 0 00:21:58.971 }, 00:21:58.971 "claimed": true, 00:21:58.971 "claim_type": "exclusive_write", 00:21:58.971 "zoned": false, 00:21:58.971 "supported_io_types": { 00:21:58.971 "read": true, 00:21:58.971 "write": true, 00:21:58.971 "unmap": true, 00:21:58.971 "flush": true, 00:21:58.971 "reset": true, 00:21:58.971 "nvme_admin": false, 00:21:58.971 "nvme_io": false, 00:21:58.971 "nvme_io_md": false, 00:21:58.971 "write_zeroes": true, 00:21:58.971 "zcopy": true, 00:21:58.971 "get_zone_info": false, 00:21:58.971 "zone_management": false, 00:21:58.971 "zone_append": false, 00:21:58.971 "compare": false, 00:21:58.971 "compare_and_write": false, 00:21:58.971 "abort": true, 00:21:58.971 "seek_hole": false, 00:21:58.971 "seek_data": false, 00:21:58.971 "copy": true, 00:21:58.971 "nvme_iov_md": false 00:21:58.971 }, 00:21:58.971 "memory_domains": [ 00:21:58.971 { 00:21:58.971 "dma_device_id": "system", 
00:21:58.971 "dma_device_type": 1 00:21:58.971 }, 00:21:58.971 { 00:21:58.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.971 "dma_device_type": 2 00:21:58.971 } 00:21:58.971 ], 00:21:58.971 "driver_specific": {} 00:21:58.971 } 00:21:58.971 ] 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.971 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.972 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.972 19:57:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:59.231 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.231 "name": "Existed_Raid", 00:21:59.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.231 "strip_size_kb": 0, 00:21:59.231 "state": "configuring", 00:21:59.231 "raid_level": "raid1", 00:21:59.231 "superblock": false, 00:21:59.231 "num_base_bdevs": 4, 00:21:59.231 "num_base_bdevs_discovered": 3, 00:21:59.231 "num_base_bdevs_operational": 4, 00:21:59.231 "base_bdevs_list": [ 00:21:59.231 { 00:21:59.231 "name": "BaseBdev1", 00:21:59.231 "uuid": "046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:21:59.231 "is_configured": true, 00:21:59.231 "data_offset": 0, 00:21:59.231 "data_size": 65536 00:21:59.231 }, 00:21:59.231 { 00:21:59.231 "name": "BaseBdev2", 00:21:59.231 "uuid": "49ab113a-0c0f-4f48-a761-811f83cfe91d", 00:21:59.231 "is_configured": true, 00:21:59.231 "data_offset": 0, 00:21:59.231 "data_size": 65536 00:21:59.231 }, 00:21:59.231 { 00:21:59.231 "name": "BaseBdev3", 00:21:59.231 "uuid": "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7", 00:21:59.231 "is_configured": true, 00:21:59.231 "data_offset": 0, 00:21:59.231 "data_size": 65536 00:21:59.231 }, 00:21:59.231 { 00:21:59.231 "name": "BaseBdev4", 00:21:59.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.231 "is_configured": false, 00:21:59.231 "data_offset": 0, 00:21:59.231 "data_size": 0 00:21:59.231 } 00:21:59.231 ] 00:21:59.231 }' 00:21:59.231 19:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.231 19:57:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.799 19:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:00.059 [2024-07-24 19:57:51.499711] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:00.059 [2024-07-24 19:57:51.499748] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xff2300 00:22:00.059 [2024-07-24 19:57:51.499757] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:00.059 [2024-07-24 19:57:51.499973] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xff3280 00:22:00.059 [2024-07-24 19:57:51.500105] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xff2300 00:22:00.059 [2024-07-24 19:57:51.500115] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xff2300 00:22:00.059 [2024-07-24 19:57:51.500273] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:00.059 BaseBdev4 00:22:00.059 19:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:00.059 19:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:00.059 19:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:00.059 19:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:00.059 19:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:00.059 19:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:00.059 19:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:00.318 19:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:00.577 [ 00:22:00.577 { 00:22:00.577 
"name": "BaseBdev4", 00:22:00.577 "aliases": [ 00:22:00.577 "f3582a1f-c2fd-4696-abfe-e5e21c3ddd28" 00:22:00.577 ], 00:22:00.577 "product_name": "Malloc disk", 00:22:00.577 "block_size": 512, 00:22:00.577 "num_blocks": 65536, 00:22:00.577 "uuid": "f3582a1f-c2fd-4696-abfe-e5e21c3ddd28", 00:22:00.577 "assigned_rate_limits": { 00:22:00.577 "rw_ios_per_sec": 0, 00:22:00.577 "rw_mbytes_per_sec": 0, 00:22:00.577 "r_mbytes_per_sec": 0, 00:22:00.577 "w_mbytes_per_sec": 0 00:22:00.577 }, 00:22:00.577 "claimed": true, 00:22:00.577 "claim_type": "exclusive_write", 00:22:00.577 "zoned": false, 00:22:00.577 "supported_io_types": { 00:22:00.577 "read": true, 00:22:00.577 "write": true, 00:22:00.577 "unmap": true, 00:22:00.577 "flush": true, 00:22:00.577 "reset": true, 00:22:00.577 "nvme_admin": false, 00:22:00.577 "nvme_io": false, 00:22:00.577 "nvme_io_md": false, 00:22:00.577 "write_zeroes": true, 00:22:00.577 "zcopy": true, 00:22:00.577 "get_zone_info": false, 00:22:00.577 "zone_management": false, 00:22:00.577 "zone_append": false, 00:22:00.577 "compare": false, 00:22:00.577 "compare_and_write": false, 00:22:00.577 "abort": true, 00:22:00.577 "seek_hole": false, 00:22:00.577 "seek_data": false, 00:22:00.577 "copy": true, 00:22:00.577 "nvme_iov_md": false 00:22:00.577 }, 00:22:00.577 "memory_domains": [ 00:22:00.577 { 00:22:00.577 "dma_device_id": "system", 00:22:00.577 "dma_device_type": 1 00:22:00.577 }, 00:22:00.577 { 00:22:00.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.577 "dma_device_type": 2 00:22:00.577 } 00:22:00.577 ], 00:22:00.577 "driver_specific": {} 00:22:00.577 } 00:22:00.577 ] 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.577 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.578 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:00.837 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.838 "name": "Existed_Raid", 00:22:00.838 "uuid": "1ce76e03-d5b2-4ba8-b7a1-ea7e43830f34", 00:22:00.838 "strip_size_kb": 0, 00:22:00.838 "state": "online", 00:22:00.838 "raid_level": "raid1", 00:22:00.838 "superblock": false, 00:22:00.838 "num_base_bdevs": 4, 00:22:00.838 "num_base_bdevs_discovered": 4, 00:22:00.838 "num_base_bdevs_operational": 4, 00:22:00.838 "base_bdevs_list": [ 00:22:00.838 { 00:22:00.838 "name": "BaseBdev1", 00:22:00.838 "uuid": 
"046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:22:00.838 "is_configured": true, 00:22:00.838 "data_offset": 0, 00:22:00.838 "data_size": 65536 00:22:00.838 }, 00:22:00.838 { 00:22:00.838 "name": "BaseBdev2", 00:22:00.838 "uuid": "49ab113a-0c0f-4f48-a761-811f83cfe91d", 00:22:00.838 "is_configured": true, 00:22:00.838 "data_offset": 0, 00:22:00.838 "data_size": 65536 00:22:00.838 }, 00:22:00.838 { 00:22:00.838 "name": "BaseBdev3", 00:22:00.838 "uuid": "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7", 00:22:00.838 "is_configured": true, 00:22:00.838 "data_offset": 0, 00:22:00.838 "data_size": 65536 00:22:00.838 }, 00:22:00.838 { 00:22:00.838 "name": "BaseBdev4", 00:22:00.838 "uuid": "f3582a1f-c2fd-4696-abfe-e5e21c3ddd28", 00:22:00.838 "is_configured": true, 00:22:00.838 "data_offset": 0, 00:22:00.838 "data_size": 65536 00:22:00.838 } 00:22:00.838 ] 00:22:00.838 }' 00:22:00.838 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.838 19:57:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:01.406 19:57:52 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:01.665 [2024-07-24 19:57:53.124375] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:01.665 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:01.665 "name": "Existed_Raid", 00:22:01.665 "aliases": [ 00:22:01.665 "1ce76e03-d5b2-4ba8-b7a1-ea7e43830f34" 00:22:01.665 ], 00:22:01.665 "product_name": "Raid Volume", 00:22:01.665 "block_size": 512, 00:22:01.665 "num_blocks": 65536, 00:22:01.665 "uuid": "1ce76e03-d5b2-4ba8-b7a1-ea7e43830f34", 00:22:01.665 "assigned_rate_limits": { 00:22:01.665 "rw_ios_per_sec": 0, 00:22:01.665 "rw_mbytes_per_sec": 0, 00:22:01.665 "r_mbytes_per_sec": 0, 00:22:01.665 "w_mbytes_per_sec": 0 00:22:01.665 }, 00:22:01.665 "claimed": false, 00:22:01.665 "zoned": false, 00:22:01.665 "supported_io_types": { 00:22:01.665 "read": true, 00:22:01.665 "write": true, 00:22:01.665 "unmap": false, 00:22:01.665 "flush": false, 00:22:01.665 "reset": true, 00:22:01.665 "nvme_admin": false, 00:22:01.665 "nvme_io": false, 00:22:01.665 "nvme_io_md": false, 00:22:01.665 "write_zeroes": true, 00:22:01.665 "zcopy": false, 00:22:01.665 "get_zone_info": false, 00:22:01.665 "zone_management": false, 00:22:01.665 "zone_append": false, 00:22:01.665 "compare": false, 00:22:01.665 "compare_and_write": false, 00:22:01.665 "abort": false, 00:22:01.665 "seek_hole": false, 00:22:01.665 "seek_data": false, 00:22:01.665 "copy": false, 00:22:01.665 "nvme_iov_md": false 00:22:01.665 }, 00:22:01.665 "memory_domains": [ 00:22:01.665 { 00:22:01.665 "dma_device_id": "system", 00:22:01.665 "dma_device_type": 1 00:22:01.665 }, 00:22:01.665 { 00:22:01.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.665 "dma_device_type": 2 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "dma_device_id": "system", 00:22:01.666 "dma_device_type": 1 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.666 "dma_device_type": 2 
00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "dma_device_id": "system", 00:22:01.666 "dma_device_type": 1 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.666 "dma_device_type": 2 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "dma_device_id": "system", 00:22:01.666 "dma_device_type": 1 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.666 "dma_device_type": 2 00:22:01.666 } 00:22:01.666 ], 00:22:01.666 "driver_specific": { 00:22:01.666 "raid": { 00:22:01.666 "uuid": "1ce76e03-d5b2-4ba8-b7a1-ea7e43830f34", 00:22:01.666 "strip_size_kb": 0, 00:22:01.666 "state": "online", 00:22:01.666 "raid_level": "raid1", 00:22:01.666 "superblock": false, 00:22:01.666 "num_base_bdevs": 4, 00:22:01.666 "num_base_bdevs_discovered": 4, 00:22:01.666 "num_base_bdevs_operational": 4, 00:22:01.666 "base_bdevs_list": [ 00:22:01.666 { 00:22:01.666 "name": "BaseBdev1", 00:22:01.666 "uuid": "046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:22:01.666 "is_configured": true, 00:22:01.666 "data_offset": 0, 00:22:01.666 "data_size": 65536 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "name": "BaseBdev2", 00:22:01.666 "uuid": "49ab113a-0c0f-4f48-a761-811f83cfe91d", 00:22:01.666 "is_configured": true, 00:22:01.666 "data_offset": 0, 00:22:01.666 "data_size": 65536 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "name": "BaseBdev3", 00:22:01.666 "uuid": "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7", 00:22:01.666 "is_configured": true, 00:22:01.666 "data_offset": 0, 00:22:01.666 "data_size": 65536 00:22:01.666 }, 00:22:01.666 { 00:22:01.666 "name": "BaseBdev4", 00:22:01.666 "uuid": "f3582a1f-c2fd-4696-abfe-e5e21c3ddd28", 00:22:01.666 "is_configured": true, 00:22:01.666 "data_offset": 0, 00:22:01.666 "data_size": 65536 00:22:01.666 } 00:22:01.666 ] 00:22:01.666 } 00:22:01.666 } 00:22:01.666 }' 00:22:01.666 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:22:01.666 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:01.666 BaseBdev2 00:22:01.666 BaseBdev3 00:22:01.666 BaseBdev4' 00:22:01.666 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:01.666 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:01.666 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.244 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.244 "name": "BaseBdev1", 00:22:02.244 "aliases": [ 00:22:02.244 "046f0ec2-ae2a-4d0d-af68-eac98511b4b1" 00:22:02.244 ], 00:22:02.244 "product_name": "Malloc disk", 00:22:02.244 "block_size": 512, 00:22:02.244 "num_blocks": 65536, 00:22:02.244 "uuid": "046f0ec2-ae2a-4d0d-af68-eac98511b4b1", 00:22:02.244 "assigned_rate_limits": { 00:22:02.244 "rw_ios_per_sec": 0, 00:22:02.244 "rw_mbytes_per_sec": 0, 00:22:02.244 "r_mbytes_per_sec": 0, 00:22:02.244 "w_mbytes_per_sec": 0 00:22:02.244 }, 00:22:02.244 "claimed": true, 00:22:02.244 "claim_type": "exclusive_write", 00:22:02.244 "zoned": false, 00:22:02.244 "supported_io_types": { 00:22:02.244 "read": true, 00:22:02.244 "write": true, 00:22:02.244 "unmap": true, 00:22:02.244 "flush": true, 00:22:02.244 "reset": true, 00:22:02.244 "nvme_admin": false, 00:22:02.244 "nvme_io": false, 00:22:02.244 "nvme_io_md": false, 00:22:02.244 "write_zeroes": true, 00:22:02.244 "zcopy": true, 00:22:02.244 "get_zone_info": false, 00:22:02.244 "zone_management": false, 00:22:02.244 "zone_append": false, 00:22:02.244 "compare": false, 00:22:02.244 "compare_and_write": false, 00:22:02.244 "abort": true, 00:22:02.244 "seek_hole": false, 00:22:02.244 "seek_data": false, 00:22:02.244 "copy": true, 00:22:02.244 
"nvme_iov_md": false 00:22:02.244 }, 00:22:02.244 "memory_domains": [ 00:22:02.244 { 00:22:02.244 "dma_device_id": "system", 00:22:02.244 "dma_device_type": 1 00:22:02.244 }, 00:22:02.244 { 00:22:02.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.244 "dma_device_type": 2 00:22:02.244 } 00:22:02.244 ], 00:22:02.244 "driver_specific": {} 00:22:02.244 }' 00:22:02.244 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.244 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.503 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:02.503 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.503 19:57:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.503 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:02.503 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:02.762 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 
-- # jq '.[]' 00:22:03.330 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:03.330 "name": "BaseBdev2", 00:22:03.330 "aliases": [ 00:22:03.330 "49ab113a-0c0f-4f48-a761-811f83cfe91d" 00:22:03.330 ], 00:22:03.330 "product_name": "Malloc disk", 00:22:03.330 "block_size": 512, 00:22:03.330 "num_blocks": 65536, 00:22:03.330 "uuid": "49ab113a-0c0f-4f48-a761-811f83cfe91d", 00:22:03.330 "assigned_rate_limits": { 00:22:03.330 "rw_ios_per_sec": 0, 00:22:03.330 "rw_mbytes_per_sec": 0, 00:22:03.330 "r_mbytes_per_sec": 0, 00:22:03.330 "w_mbytes_per_sec": 0 00:22:03.330 }, 00:22:03.330 "claimed": true, 00:22:03.330 "claim_type": "exclusive_write", 00:22:03.330 "zoned": false, 00:22:03.330 "supported_io_types": { 00:22:03.330 "read": true, 00:22:03.330 "write": true, 00:22:03.330 "unmap": true, 00:22:03.330 "flush": true, 00:22:03.330 "reset": true, 00:22:03.330 "nvme_admin": false, 00:22:03.330 "nvme_io": false, 00:22:03.330 "nvme_io_md": false, 00:22:03.330 "write_zeroes": true, 00:22:03.330 "zcopy": true, 00:22:03.330 "get_zone_info": false, 00:22:03.330 "zone_management": false, 00:22:03.330 "zone_append": false, 00:22:03.330 "compare": false, 00:22:03.330 "compare_and_write": false, 00:22:03.330 "abort": true, 00:22:03.330 "seek_hole": false, 00:22:03.330 "seek_data": false, 00:22:03.330 "copy": true, 00:22:03.330 "nvme_iov_md": false 00:22:03.330 }, 00:22:03.330 "memory_domains": [ 00:22:03.330 { 00:22:03.330 "dma_device_id": "system", 00:22:03.330 "dma_device_type": 1 00:22:03.330 }, 00:22:03.330 { 00:22:03.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.330 "dma_device_type": 2 00:22:03.330 } 00:22:03.330 ], 00:22:03.330 "driver_specific": {} 00:22:03.330 }' 00:22:03.330 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.330 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.330 19:57:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:03.330 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.590 19:57:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.590 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:03.590 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.590 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.851 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:03.851 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.851 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:03.851 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:03.851 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:03.851 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:03.851 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:04.418 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:04.418 "name": "BaseBdev3", 00:22:04.418 "aliases": [ 00:22:04.418 "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7" 00:22:04.418 ], 00:22:04.418 "product_name": "Malloc disk", 00:22:04.418 "block_size": 512, 00:22:04.418 "num_blocks": 65536, 00:22:04.418 "uuid": "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7", 00:22:04.418 "assigned_rate_limits": { 00:22:04.418 "rw_ios_per_sec": 0, 00:22:04.418 "rw_mbytes_per_sec": 0, 00:22:04.418 "r_mbytes_per_sec": 0, 00:22:04.418 "w_mbytes_per_sec": 0 00:22:04.418 }, 
00:22:04.418 "claimed": true, 00:22:04.418 "claim_type": "exclusive_write", 00:22:04.418 "zoned": false, 00:22:04.418 "supported_io_types": { 00:22:04.418 "read": true, 00:22:04.418 "write": true, 00:22:04.418 "unmap": true, 00:22:04.418 "flush": true, 00:22:04.418 "reset": true, 00:22:04.418 "nvme_admin": false, 00:22:04.418 "nvme_io": false, 00:22:04.418 "nvme_io_md": false, 00:22:04.418 "write_zeroes": true, 00:22:04.418 "zcopy": true, 00:22:04.418 "get_zone_info": false, 00:22:04.418 "zone_management": false, 00:22:04.418 "zone_append": false, 00:22:04.418 "compare": false, 00:22:04.418 "compare_and_write": false, 00:22:04.418 "abort": true, 00:22:04.418 "seek_hole": false, 00:22:04.418 "seek_data": false, 00:22:04.418 "copy": true, 00:22:04.418 "nvme_iov_md": false 00:22:04.418 }, 00:22:04.418 "memory_domains": [ 00:22:04.418 { 00:22:04.418 "dma_device_id": "system", 00:22:04.418 "dma_device_type": 1 00:22:04.418 }, 00:22:04.418 { 00:22:04.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.418 "dma_device_type": 2 00:22:04.418 } 00:22:04.418 ], 00:22:04.418 "driver_specific": {} 00:22:04.418 }' 00:22:04.418 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.418 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.418 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:04.418 19:57:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.677 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.677 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.677 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.677 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.677 19:57:56 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:22:04.677 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:22:04.936 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:22:04.936 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:22:04.936 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:22:04.936 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4
00:22:04.936 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:22:05.195 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:22:05.195 "name": "BaseBdev4",
00:22:05.195 "aliases": [
00:22:05.195 "f3582a1f-c2fd-4696-abfe-e5e21c3ddd28"
00:22:05.195 ],
00:22:05.195 "product_name": "Malloc disk",
00:22:05.195 "block_size": 512,
00:22:05.195 "num_blocks": 65536,
00:22:05.195 "uuid": "f3582a1f-c2fd-4696-abfe-e5e21c3ddd28",
00:22:05.195 "assigned_rate_limits": {
00:22:05.195 "rw_ios_per_sec": 0,
00:22:05.195 "rw_mbytes_per_sec": 0,
00:22:05.195 "r_mbytes_per_sec": 0,
00:22:05.195 "w_mbytes_per_sec": 0
00:22:05.195 },
00:22:05.195 "claimed": true,
00:22:05.195 "claim_type": "exclusive_write",
00:22:05.195 "zoned": false,
00:22:05.195 "supported_io_types": {
00:22:05.195 "read": true,
00:22:05.195 "write": true,
00:22:05.195 "unmap": true,
00:22:05.195 "flush": true,
00:22:05.195 "reset": true,
00:22:05.195 "nvme_admin": false,
00:22:05.195 "nvme_io": false,
00:22:05.195 "nvme_io_md": false,
00:22:05.195 "write_zeroes": true,
00:22:05.195 "zcopy": true,
00:22:05.195 "get_zone_info": false,
00:22:05.195 "zone_management": false,
00:22:05.195 "zone_append": false,
00:22:05.195 "compare": false,
00:22:05.195 "compare_and_write": false,
00:22:05.195 "abort": true,
00:22:05.195 "seek_hole": false,
00:22:05.195 "seek_data": false,
00:22:05.195 "copy": true,
00:22:05.195 "nvme_iov_md": false
00:22:05.195 },
00:22:05.195 "memory_domains": [
00:22:05.195 {
00:22:05.195 "dma_device_id": "system",
00:22:05.195 "dma_device_type": 1
00:22:05.195 },
00:22:05.195 {
00:22:05.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:05.196 "dma_device_type": 2
00:22:05.196 }
00:22:05.196 ],
00:22:05.196 "driver_specific": {}
00:22:05.196 }'
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:22:05.196 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:22:05.455 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:22:05.455 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:22:05.455 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:22:05.455 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:22:05.455 19:57:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:22:05.713 [2024-07-24 19:57:57.122691] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:22:05.713 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:05.714 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:05.973 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:05.973 "name": "Existed_Raid",
00:22:05.973 "uuid": "1ce76e03-d5b2-4ba8-b7a1-ea7e43830f34",
00:22:05.973 "strip_size_kb": 0,
00:22:05.973 "state": "online",
00:22:05.973 "raid_level": "raid1",
00:22:05.973 "superblock": false,
00:22:05.973 "num_base_bdevs": 4,
00:22:05.973 "num_base_bdevs_discovered": 3,
00:22:05.973 "num_base_bdevs_operational": 3,
00:22:05.973 "base_bdevs_list": [
00:22:05.973 {
00:22:05.973 "name": null,
00:22:05.973 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:05.973 "is_configured": false,
00:22:05.973 "data_offset": 0,
00:22:05.973 "data_size": 65536
00:22:05.973 },
00:22:05.973 {
00:22:05.973 "name": "BaseBdev2",
00:22:05.973 "uuid": "49ab113a-0c0f-4f48-a761-811f83cfe91d",
00:22:05.973 "is_configured": true,
00:22:05.973 "data_offset": 0,
00:22:05.973 "data_size": 65536
00:22:05.973 },
00:22:05.973 {
00:22:05.973 "name": "BaseBdev3",
00:22:05.973 "uuid": "f728cdc7-8909-4d9e-b8cf-c4ddde516ba7",
00:22:05.973 "is_configured": true,
00:22:05.973 "data_offset": 0,
00:22:05.973 "data_size": 65536
00:22:05.973 },
00:22:05.973 {
00:22:05.973 "name": "BaseBdev4",
00:22:05.973 "uuid": "f3582a1f-c2fd-4696-abfe-e5e21c3ddd28",
00:22:05.973 "is_configured": true,
00:22:05.973 "data_offset": 0,
00:22:05.973 "data_size": 65536
00:22:05.973 }
00:22:05.973 ]
00:22:05.973 }'
00:22:05.973 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:05.973 19:57:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:22:06.541 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:22:06.541 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:22:06.541 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:06.541 19:57:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:22:06.800 19:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:22:06.800 19:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:22:06.800 19:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:22:07.368 [2024-07-24 19:57:58.716008] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:22:07.368 19:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:22:07.368 19:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:22:07.368 19:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:07.368 19:57:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:22:07.936 19:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:22:07.936 19:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:22:07.936 19:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:22:08.194 [2024-07-24 19:57:59.753481] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:22:08.451 19:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:22:08.451 19:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:22:08.451 19:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:22:08.451 19:57:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:09.017 19:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:22:09.017 19:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:22:09.017 19:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4
00:22:09.275 [2024-07-24 19:58:00.796585] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4
00:22:09.275 [2024-07-24 19:58:00.796662] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:22:09.275 [2024-07-24 19:58:00.807498] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:22:09.275 [2024-07-24 19:58:00.807532] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:22:09.276 [2024-07-24 19:58:00.807544] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xff2300 name Existed_Raid, state offline
00:22:09.276 19:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:22:09.276 19:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:22:09.276 19:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:09.276 19:58:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:22:09.843 19:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:22:09.843 19:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:22:09.843 19:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']'
00:22:09.843 19:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:22:09.843 19:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:22:09.843 19:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:22:10.101 BaseBdev2
00:22:10.101 19:58:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:22:10.101 19:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2
00:22:10.101 19:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:22:10.101 19:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:22:10.101 19:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:22:10.101 19:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:22:10.101 19:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:22:10.667 19:58:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:22:10.667 [
00:22:10.667 {
00:22:10.667 "name": "BaseBdev2",
00:22:10.667 "aliases": [
00:22:10.667 "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1"
00:22:10.667 ],
00:22:10.667 "product_name": "Malloc disk",
00:22:10.667 "block_size": 512,
00:22:10.667 "num_blocks": 65536,
00:22:10.667 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1",
00:22:10.667 "assigned_rate_limits": {
00:22:10.667 "rw_ios_per_sec": 0,
00:22:10.667 "rw_mbytes_per_sec": 0,
00:22:10.667 "r_mbytes_per_sec": 0,
00:22:10.667 "w_mbytes_per_sec": 0
00:22:10.667 },
00:22:10.667 "claimed": false,
00:22:10.667 "zoned": false,
00:22:10.667 "supported_io_types": {
00:22:10.667 "read": true,
00:22:10.667 "write": true,
00:22:10.667 "unmap": true,
00:22:10.667 "flush": true,
00:22:10.667 "reset": true,
00:22:10.667 "nvme_admin": false,
00:22:10.667 "nvme_io": false,
00:22:10.667 "nvme_io_md": false,
00:22:10.667 "write_zeroes": true,
00:22:10.667 "zcopy": true,
00:22:10.667 "get_zone_info": false,
00:22:10.667 "zone_management": false,
00:22:10.667 "zone_append": false,
00:22:10.667 "compare": false,
00:22:10.668 "compare_and_write": false,
00:22:10.668 "abort": true,
00:22:10.668 "seek_hole": false,
00:22:10.668 "seek_data": false,
00:22:10.668 "copy": true,
00:22:10.668 "nvme_iov_md": false
00:22:10.668 },
00:22:10.668 "memory_domains": [
00:22:10.668 {
00:22:10.668 "dma_device_id": "system",
00:22:10.668 "dma_device_type": 1
00:22:10.668 },
00:22:10.668 {
00:22:10.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:10.668 "dma_device_type": 2
00:22:10.668 }
00:22:10.668 ],
00:22:10.668 "driver_specific": {}
00:22:10.668 }
00:22:10.668 ]
00:22:10.668 19:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:22:10.668 19:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:22:10.668 19:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:22:10.668 19:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:22:11.235 BaseBdev3
00:22:11.235 19:58:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:22:11.235 19:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3
00:22:11.235 19:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:22:11.235 19:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:22:11.235 19:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:22:11.235 19:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:22:11.235 19:58:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:22:11.493 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:22:12.060 [
00:22:12.060 {
00:22:12.060 "name": "BaseBdev3",
00:22:12.060 "aliases": [
00:22:12.060 "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c"
00:22:12.060 ],
00:22:12.060 "product_name": "Malloc disk",
00:22:12.060 "block_size": 512,
00:22:12.060 "num_blocks": 65536,
00:22:12.060 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c",
00:22:12.060 "assigned_rate_limits": {
00:22:12.060 "rw_ios_per_sec": 0,
00:22:12.060 "rw_mbytes_per_sec": 0,
00:22:12.060 "r_mbytes_per_sec": 0,
00:22:12.060 "w_mbytes_per_sec": 0
00:22:12.060 },
00:22:12.060 "claimed": false,
00:22:12.060 "zoned": false,
00:22:12.060 "supported_io_types": {
00:22:12.060 "read": true,
00:22:12.060 "write": true,
00:22:12.060 "unmap": true,
00:22:12.060 "flush": true,
00:22:12.060 "reset": true,
00:22:12.060 "nvme_admin": false,
00:22:12.060 "nvme_io": false,
00:22:12.060 "nvme_io_md": false,
00:22:12.060 "write_zeroes": true,
00:22:12.060 "zcopy": true,
00:22:12.060 "get_zone_info": false,
00:22:12.060 "zone_management": false,
00:22:12.060 "zone_append": false,
00:22:12.060 "compare": false,
00:22:12.060 "compare_and_write": false,
00:22:12.060 "abort": true,
00:22:12.060 "seek_hole": false,
00:22:12.060 "seek_data": false,
00:22:12.060 "copy": true,
00:22:12.060 "nvme_iov_md": false
00:22:12.060 },
00:22:12.060 "memory_domains": [
00:22:12.060 {
00:22:12.060 "dma_device_id": "system",
00:22:12.060 "dma_device_type": 1
00:22:12.060 },
00:22:12.060 {
00:22:12.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:12.060 "dma_device_type": 2
00:22:12.060 }
00:22:12.060 ],
00:22:12.060 "driver_specific": {}
00:22:12.060 }
00:22:12.060 ]
00:22:12.060 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:22:12.060 19:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:22:12.060 19:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:22:12.060 19:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:22:12.319 BaseBdev4
00:22:12.319 19:58:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4
00:22:12.319 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4
00:22:12.319 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:22:12.319 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:22:12.319 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:22:12.319 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:22:12.319 19:58:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:22:12.885 19:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:22:13.144 [
00:22:13.144 {
00:22:13.144 "name": "BaseBdev4",
00:22:13.144 "aliases": [
00:22:13.144 "5080280b-c380-4621-9f9a-7e85ff705868"
00:22:13.144 ],
00:22:13.144 "product_name": "Malloc disk",
00:22:13.144 "block_size": 512,
00:22:13.144 "num_blocks": 65536,
00:22:13.144 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868",
00:22:13.144 "assigned_rate_limits": {
00:22:13.144 "rw_ios_per_sec": 0,
00:22:13.144 "rw_mbytes_per_sec": 0,
00:22:13.144 "r_mbytes_per_sec": 0,
00:22:13.144 "w_mbytes_per_sec": 0
00:22:13.144 },
00:22:13.144 "claimed": false,
00:22:13.144 "zoned": false,
00:22:13.144 "supported_io_types": {
00:22:13.144 "read": true,
00:22:13.144 "write": true,
00:22:13.144 "unmap": true,
00:22:13.144 "flush": true,
00:22:13.144 "reset": true,
00:22:13.144 "nvme_admin": false,
00:22:13.144 "nvme_io": false,
00:22:13.144 "nvme_io_md": false,
00:22:13.144 "write_zeroes": true,
00:22:13.144 "zcopy": true,
00:22:13.144 "get_zone_info": false,
00:22:13.144 "zone_management": false,
00:22:13.144 "zone_append": false,
00:22:13.144 "compare": false,
00:22:13.144 "compare_and_write": false,
00:22:13.144 "abort": true,
00:22:13.144 "seek_hole": false,
00:22:13.144 "seek_data": false,
00:22:13.144 "copy": true,
00:22:13.144 "nvme_iov_md": false
00:22:13.144 },
00:22:13.144 "memory_domains": [
00:22:13.144 {
00:22:13.144 "dma_device_id": "system",
00:22:13.144 "dma_device_type": 1
00:22:13.144 },
00:22:13.144 {
00:22:13.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:13.144 "dma_device_type": 2
00:22:13.144 }
00:22:13.144 ],
00:22:13.144 "driver_specific": {}
00:22:13.144 }
00:22:13.144 ]
00:22:13.144 19:58:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:22:13.144 19:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:22:13.144 19:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:22:13.144 19:58:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:22:13.711 [2024-07-24 19:58:05.181334] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:22:13.711 [2024-07-24 19:58:05.181374] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:22:13.711 [2024-07-24 19:58:05.181402] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:22:13.711 [2024-07-24 19:58:05.182775] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:22:13.711 [2024-07-24 19:58:05.182815] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:22:13.711 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:22:13.711 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:13.711 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:13.712 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:14.324 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:14.324 "name": "Existed_Raid",
00:22:14.324 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:14.324 "strip_size_kb": 0,
00:22:14.324 "state": "configuring",
00:22:14.324 "raid_level": "raid1",
00:22:14.324 "superblock": false,
00:22:14.324 "num_base_bdevs": 4,
00:22:14.324 "num_base_bdevs_discovered": 3,
00:22:14.324 "num_base_bdevs_operational": 4,
00:22:14.324 "base_bdevs_list": [
00:22:14.324 {
00:22:14.324 "name": "BaseBdev1",
00:22:14.324 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:14.324 "is_configured": false,
00:22:14.324 "data_offset": 0,
00:22:14.324 "data_size": 0
00:22:14.324 },
00:22:14.324 {
00:22:14.324 "name": "BaseBdev2",
00:22:14.324 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1",
00:22:14.324 "is_configured": true,
00:22:14.324 "data_offset": 0,
00:22:14.324 "data_size": 65536
00:22:14.324 },
00:22:14.324 {
00:22:14.324 "name": "BaseBdev3",
00:22:14.324 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c",
00:22:14.324 "is_configured": true,
00:22:14.324 "data_offset": 0,
00:22:14.324 "data_size": 65536
00:22:14.324 },
00:22:14.324 {
00:22:14.324 "name": "BaseBdev4",
00:22:14.324 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868",
00:22:14.324 "is_configured": true,
00:22:14.324 "data_offset": 0,
00:22:14.324 "data_size": 65536
00:22:14.324 }
00:22:14.324 ]
00:22:14.324 }'
00:22:14.324 19:58:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:14.324 19:58:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:22:14.951 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:22:15.211 [2024-07-24 19:58:06.548947] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:15.211 "name": "Existed_Raid",
00:22:15.211 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:15.211 "strip_size_kb": 0,
00:22:15.211 "state": "configuring",
00:22:15.211 "raid_level": "raid1",
00:22:15.211 "superblock": false,
00:22:15.211 "num_base_bdevs": 4,
00:22:15.211 "num_base_bdevs_discovered": 2,
00:22:15.211 "num_base_bdevs_operational": 4,
00:22:15.211 "base_bdevs_list": [
00:22:15.211 {
00:22:15.211 "name": "BaseBdev1",
00:22:15.211 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:15.211 "is_configured": false,
00:22:15.211 "data_offset": 0,
00:22:15.211 "data_size": 0
00:22:15.211 },
00:22:15.211 {
00:22:15.211 "name": null,
00:22:15.211 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1",
00:22:15.211 "is_configured": false,
00:22:15.211 "data_offset": 0,
00:22:15.211 "data_size": 65536
00:22:15.211 },
00:22:15.211 {
00:22:15.211 "name": "BaseBdev3",
00:22:15.211 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c",
00:22:15.211 "is_configured": true,
00:22:15.211 "data_offset": 0,
00:22:15.211 "data_size": 65536
00:22:15.211 },
00:22:15.211 {
00:22:15.211 "name": "BaseBdev4",
00:22:15.211 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868",
00:22:15.211 "is_configured": true,
00:22:15.211 "data_offset": 0,
00:22:15.211 "data_size": 65536
00:22:15.211 }
00:22:15.211 ]
00:22:15.211 }'
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:15.211 19:58:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:22:15.780 19:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:15.780 19:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:22:16.039 19:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:22:16.039 19:58:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:22:16.608 [2024-07-24 19:58:08.033516] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:22:16.608 BaseBdev1
00:22:16.608 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:22:16.608 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1
00:22:16.608 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:22:16.608 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:22:16.608 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:22:16.608 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:22:16.608 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:22:16.867 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:22:17.135 [
00:22:17.135 {
00:22:17.135 "name": "BaseBdev1",
00:22:17.135 "aliases": [
00:22:17.135 "0dc2af7b-6abb-44a3-a701-0149509eb429"
00:22:17.135 ],
00:22:17.135 "product_name": "Malloc disk",
00:22:17.135 "block_size": 512,
00:22:17.135 "num_blocks": 65536,
00:22:17.135 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429",
00:22:17.135 "assigned_rate_limits": {
00:22:17.135 "rw_ios_per_sec": 0,
00:22:17.135 "rw_mbytes_per_sec": 0,
00:22:17.135 "r_mbytes_per_sec": 0,
00:22:17.135 "w_mbytes_per_sec": 0
00:22:17.135 },
00:22:17.135 "claimed": true,
00:22:17.135 "claim_type": "exclusive_write",
00:22:17.135 "zoned": false,
00:22:17.135 "supported_io_types": {
00:22:17.135 "read": true,
00:22:17.135 "write": true,
00:22:17.135 "unmap": true,
00:22:17.135 "flush": true,
00:22:17.135 "reset": true,
00:22:17.135 "nvme_admin": false,
00:22:17.135 "nvme_io": false,
00:22:17.135 "nvme_io_md": false,
00:22:17.135 "write_zeroes": true,
00:22:17.135 "zcopy": true,
00:22:17.135 "get_zone_info": false,
00:22:17.135 "zone_management": false,
00:22:17.135 "zone_append": false,
00:22:17.135 "compare": false,
00:22:17.135 "compare_and_write": false,
00:22:17.135 "abort": true,
00:22:17.135 "seek_hole": false,
00:22:17.135 "seek_data": false,
00:22:17.135 "copy": true,
00:22:17.135 "nvme_iov_md": false
00:22:17.135 },
00:22:17.135 "memory_domains": [
00:22:17.135 {
00:22:17.135 "dma_device_id": "system",
00:22:17.135 "dma_device_type": 1
00:22:17.135 },
00:22:17.135 {
00:22:17.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:22:17.135 "dma_device_type": 2
00:22:17.135 }
00:22:17.135 ],
00:22:17.135 "driver_specific": {}
00:22:17.135 }
00:22:17.135 ]
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:17.136 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:17.395 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:17.395 "name": "Existed_Raid",
00:22:17.395 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:17.395 "strip_size_kb": 0,
00:22:17.395 "state": "configuring",
00:22:17.395 "raid_level": "raid1",
00:22:17.395 "superblock": false,
00:22:17.395 "num_base_bdevs": 4,
00:22:17.395 "num_base_bdevs_discovered": 3,
00:22:17.395 "num_base_bdevs_operational": 4,
00:22:17.395 "base_bdevs_list": [
00:22:17.395 {
00:22:17.395 "name": "BaseBdev1",
00:22:17.395 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429",
00:22:17.395 "is_configured": true,
00:22:17.395 "data_offset": 0,
00:22:17.395 "data_size": 65536
00:22:17.395 },
00:22:17.395 {
00:22:17.395 "name": null,
00:22:17.395 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1",
00:22:17.395 "is_configured": false,
00:22:17.395 "data_offset": 0,
00:22:17.395 "data_size": 65536
00:22:17.396 },
00:22:17.396 {
00:22:17.396 "name": "BaseBdev3",
00:22:17.396 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c",
00:22:17.396 "is_configured": true,
00:22:17.396 "data_offset": 0,
00:22:17.396 "data_size": 65536
00:22:17.396 },
00:22:17.396 {
00:22:17.396 "name": "BaseBdev4",
00:22:17.396 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868",
00:22:17.396 "is_configured": true,
00:22:17.396 "data_offset": 0,
00:22:17.396 "data_size": 65536
00:22:17.396 }
00:22:17.396 ]
00:22:17.396 }'
00:22:17.396 19:58:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:17.396 19:58:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:22:17.964 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:17.964 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:22:18.223 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:22:18.223 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:22:18.482 [2024-07-24 19:58:09.870409] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:18.482 19:58:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:22:18.741 19:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:22:18.741 "name": "Existed_Raid",
00:22:18.741 "uuid": "00000000-0000-0000-0000-000000000000",
00:22:18.741 "strip_size_kb": 0,
00:22:18.741 "state": "configuring",
00:22:18.741 "raid_level": "raid1",
00:22:18.741 "superblock": false,
00:22:18.741 "num_base_bdevs": 4,
00:22:18.741 "num_base_bdevs_discovered": 2,
00:22:18.741 "num_base_bdevs_operational": 4,
00:22:18.741 "base_bdevs_list": [
00:22:18.741 {
00:22:18.741 "name": "BaseBdev1",
00:22:18.741 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429",
00:22:18.741 "is_configured": true,
00:22:18.741 "data_offset": 0,
00:22:18.741 "data_size": 65536
00:22:18.741 },
00:22:18.741 {
00:22:18.741 "name": null,
00:22:18.741 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1",
00:22:18.741 "is_configured": false,
00:22:18.741 "data_offset": 0,
00:22:18.741 "data_size": 65536
00:22:18.741 },
00:22:18.741 {
00:22:18.741 "name": null,
00:22:18.741 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c",
00:22:18.741 "is_configured": false,
00:22:18.741 "data_offset": 0,
00:22:18.741 "data_size": 65536
00:22:18.741 },
00:22:18.741 {
00:22:18.741 "name": "BaseBdev4",
00:22:18.741 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868",
00:22:18.741 "is_configured": true,
00:22:18.741 "data_offset": 0,
00:22:18.741 "data_size": 65536
00:22:18.741 }
00:22:18.741 ]
00:22:18.741 }'
00:22:18.741 19:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:22:18.741 19:58:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:22:19.309 19:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:19.309 19:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:22:19.568 19:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:22:19.568 19:58:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:22:20.136 [2024-07-24 19:58:11.474688] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- #
local strip_size=0 00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.136 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:20.395 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.395 "name": "Existed_Raid", 00:22:20.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.395 "strip_size_kb": 0, 00:22:20.395 "state": "configuring", 00:22:20.395 "raid_level": "raid1", 00:22:20.395 "superblock": false, 00:22:20.395 "num_base_bdevs": 4, 00:22:20.395 "num_base_bdevs_discovered": 3, 00:22:20.395 "num_base_bdevs_operational": 4, 00:22:20.395 "base_bdevs_list": [ 00:22:20.395 { 00:22:20.395 "name": "BaseBdev1", 00:22:20.395 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429", 00:22:20.395 "is_configured": true, 00:22:20.395 "data_offset": 0, 00:22:20.395 "data_size": 65536 00:22:20.395 }, 00:22:20.395 { 00:22:20.395 "name": null, 00:22:20.395 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1", 00:22:20.395 "is_configured": false, 00:22:20.395 "data_offset": 0, 00:22:20.395 "data_size": 65536 00:22:20.395 }, 00:22:20.395 { 00:22:20.395 "name": "BaseBdev3", 00:22:20.395 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c", 00:22:20.395 "is_configured": true, 
00:22:20.395 "data_offset": 0, 00:22:20.395 "data_size": 65536 00:22:20.395 }, 00:22:20.395 { 00:22:20.395 "name": "BaseBdev4", 00:22:20.395 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868", 00:22:20.395 "is_configured": true, 00:22:20.395 "data_offset": 0, 00:22:20.395 "data_size": 65536 00:22:20.395 } 00:22:20.395 ] 00:22:20.395 }' 00:22:20.395 19:58:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.395 19:58:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:20.962 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:20.962 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.222 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:21.222 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:21.481 [2024-07-24 19:58:12.830295] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.481 19:58:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:21.741 19:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.741 "name": "Existed_Raid", 00:22:21.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.741 "strip_size_kb": 0, 00:22:21.741 "state": "configuring", 00:22:21.741 "raid_level": "raid1", 00:22:21.741 "superblock": false, 00:22:21.741 "num_base_bdevs": 4, 00:22:21.741 "num_base_bdevs_discovered": 2, 00:22:21.741 "num_base_bdevs_operational": 4, 00:22:21.741 "base_bdevs_list": [ 00:22:21.741 { 00:22:21.741 "name": null, 00:22:21.741 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429", 00:22:21.741 "is_configured": false, 00:22:21.741 "data_offset": 0, 00:22:21.741 "data_size": 65536 00:22:21.741 }, 00:22:21.741 { 00:22:21.741 "name": null, 00:22:21.741 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1", 00:22:21.741 "is_configured": false, 00:22:21.741 "data_offset": 0, 00:22:21.741 "data_size": 65536 00:22:21.741 }, 00:22:21.741 { 00:22:21.741 "name": "BaseBdev3", 00:22:21.741 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c", 00:22:21.741 "is_configured": true, 00:22:21.741 "data_offset": 0, 00:22:21.741 "data_size": 65536 00:22:21.741 }, 00:22:21.741 { 00:22:21.741 "name": 
"BaseBdev4", 00:22:21.741 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868", 00:22:21.741 "is_configured": true, 00:22:21.741 "data_offset": 0, 00:22:21.741 "data_size": 65536 00:22:21.741 } 00:22:21.741 ] 00:22:21.741 }' 00:22:21.741 19:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.741 19:58:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.307 19:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.307 19:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:22.566 19:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:22.566 19:58:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:23.133 [2024-07-24 19:58:14.449192] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.133 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:23.392 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.392 "name": "Existed_Raid", 00:22:23.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.392 "strip_size_kb": 0, 00:22:23.392 "state": "configuring", 00:22:23.392 "raid_level": "raid1", 00:22:23.392 "superblock": false, 00:22:23.392 "num_base_bdevs": 4, 00:22:23.392 "num_base_bdevs_discovered": 3, 00:22:23.392 "num_base_bdevs_operational": 4, 00:22:23.392 "base_bdevs_list": [ 00:22:23.392 { 00:22:23.392 "name": null, 00:22:23.392 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429", 00:22:23.392 "is_configured": false, 00:22:23.392 "data_offset": 0, 00:22:23.392 "data_size": 65536 00:22:23.392 }, 00:22:23.392 { 00:22:23.392 "name": "BaseBdev2", 00:22:23.392 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1", 00:22:23.392 "is_configured": true, 00:22:23.392 "data_offset": 0, 00:22:23.392 "data_size": 65536 00:22:23.392 }, 00:22:23.392 { 00:22:23.392 "name": "BaseBdev3", 00:22:23.392 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c", 00:22:23.392 "is_configured": true, 00:22:23.392 "data_offset": 0, 00:22:23.392 "data_size": 65536 00:22:23.392 }, 00:22:23.392 { 00:22:23.392 "name": "BaseBdev4", 00:22:23.392 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868", 00:22:23.392 
"is_configured": true, 00:22:23.392 "data_offset": 0, 00:22:23.392 "data_size": 65536 00:22:23.392 } 00:22:23.392 ] 00:22:23.392 }' 00:22:23.392 19:58:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.392 19:58:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.959 19:58:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.960 19:58:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:24.218 19:58:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:24.218 19:58:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.218 19:58:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:24.475 19:58:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0dc2af7b-6abb-44a3-a701-0149509eb429 00:22:24.475 [2024-07-24 19:58:16.066037] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:24.475 [2024-07-24 19:58:16.066079] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xff4130 00:22:24.475 [2024-07-24 19:58:16.066087] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:24.475 [2024-07-24 19:58:16.066285] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xff1170 00:22:24.475 [2024-07-24 19:58:16.066429] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xff4130 00:22:24.475 [2024-07-24 
19:58:16.066439] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xff4130 00:22:24.475 [2024-07-24 19:58:16.066601] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:24.475 NewBaseBdev 00:22:24.733 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:24.733 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:22:24.733 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:24.733 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:22:24.733 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:24.733 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:24.733 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:24.992 [ 00:22:24.992 { 00:22:24.992 "name": "NewBaseBdev", 00:22:24.992 "aliases": [ 00:22:24.992 "0dc2af7b-6abb-44a3-a701-0149509eb429" 00:22:24.992 ], 00:22:24.992 "product_name": "Malloc disk", 00:22:24.992 "block_size": 512, 00:22:24.992 "num_blocks": 65536, 00:22:24.992 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429", 00:22:24.992 "assigned_rate_limits": { 00:22:24.992 "rw_ios_per_sec": 0, 00:22:24.992 "rw_mbytes_per_sec": 0, 00:22:24.992 "r_mbytes_per_sec": 0, 00:22:24.992 "w_mbytes_per_sec": 0 00:22:24.992 }, 00:22:24.992 "claimed": true, 00:22:24.992 "claim_type": "exclusive_write", 00:22:24.992 "zoned": 
false, 00:22:24.992 "supported_io_types": { 00:22:24.992 "read": true, 00:22:24.992 "write": true, 00:22:24.992 "unmap": true, 00:22:24.992 "flush": true, 00:22:24.992 "reset": true, 00:22:24.992 "nvme_admin": false, 00:22:24.992 "nvme_io": false, 00:22:24.992 "nvme_io_md": false, 00:22:24.992 "write_zeroes": true, 00:22:24.992 "zcopy": true, 00:22:24.992 "get_zone_info": false, 00:22:24.992 "zone_management": false, 00:22:24.992 "zone_append": false, 00:22:24.992 "compare": false, 00:22:24.992 "compare_and_write": false, 00:22:24.992 "abort": true, 00:22:24.992 "seek_hole": false, 00:22:24.992 "seek_data": false, 00:22:24.992 "copy": true, 00:22:24.992 "nvme_iov_md": false 00:22:24.992 }, 00:22:24.992 "memory_domains": [ 00:22:24.992 { 00:22:24.992 "dma_device_id": "system", 00:22:24.992 "dma_device_type": 1 00:22:24.992 }, 00:22:24.992 { 00:22:24.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.992 "dma_device_type": 2 00:22:24.992 } 00:22:24.992 ], 00:22:24.992 "driver_specific": {} 00:22:24.992 } 00:22:24.992 ] 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.992 19:58:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.992 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:25.250 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:25.250 "name": "Existed_Raid", 00:22:25.250 "uuid": "c5e8a40b-28b1-472e-85b4-211bd777bb6a", 00:22:25.250 "strip_size_kb": 0, 00:22:25.250 "state": "online", 00:22:25.250 "raid_level": "raid1", 00:22:25.250 "superblock": false, 00:22:25.250 "num_base_bdevs": 4, 00:22:25.250 "num_base_bdevs_discovered": 4, 00:22:25.250 "num_base_bdevs_operational": 4, 00:22:25.250 "base_bdevs_list": [ 00:22:25.250 { 00:22:25.250 "name": "NewBaseBdev", 00:22:25.250 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429", 00:22:25.250 "is_configured": true, 00:22:25.250 "data_offset": 0, 00:22:25.250 "data_size": 65536 00:22:25.250 }, 00:22:25.250 { 00:22:25.250 "name": "BaseBdev2", 00:22:25.250 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1", 00:22:25.250 "is_configured": true, 00:22:25.250 "data_offset": 0, 00:22:25.250 "data_size": 65536 00:22:25.250 }, 00:22:25.250 { 00:22:25.250 "name": "BaseBdev3", 00:22:25.250 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c", 00:22:25.250 "is_configured": true, 00:22:25.250 "data_offset": 0, 00:22:25.250 "data_size": 65536 00:22:25.250 }, 00:22:25.250 { 00:22:25.250 "name": "BaseBdev4", 00:22:25.250 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868", 00:22:25.250 "is_configured": true, 00:22:25.250 "data_offset": 0, 00:22:25.250 
"data_size": 65536 00:22:25.250 } 00:22:25.250 ] 00:22:25.250 }' 00:22:25.250 19:58:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:25.250 19:58:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:26.185 [2024-07-24 19:58:17.650581] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:26.185 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:26.185 "name": "Existed_Raid", 00:22:26.185 "aliases": [ 00:22:26.185 "c5e8a40b-28b1-472e-85b4-211bd777bb6a" 00:22:26.185 ], 00:22:26.185 "product_name": "Raid Volume", 00:22:26.185 "block_size": 512, 00:22:26.185 "num_blocks": 65536, 00:22:26.185 "uuid": "c5e8a40b-28b1-472e-85b4-211bd777bb6a", 00:22:26.185 "assigned_rate_limits": { 00:22:26.185 "rw_ios_per_sec": 0, 00:22:26.185 "rw_mbytes_per_sec": 0, 00:22:26.185 "r_mbytes_per_sec": 0, 00:22:26.185 "w_mbytes_per_sec": 0 00:22:26.185 }, 00:22:26.185 "claimed": false, 
00:22:26.185 "zoned": false, 00:22:26.185 "supported_io_types": { 00:22:26.185 "read": true, 00:22:26.185 "write": true, 00:22:26.185 "unmap": false, 00:22:26.185 "flush": false, 00:22:26.185 "reset": true, 00:22:26.185 "nvme_admin": false, 00:22:26.185 "nvme_io": false, 00:22:26.185 "nvme_io_md": false, 00:22:26.185 "write_zeroes": true, 00:22:26.185 "zcopy": false, 00:22:26.185 "get_zone_info": false, 00:22:26.185 "zone_management": false, 00:22:26.185 "zone_append": false, 00:22:26.185 "compare": false, 00:22:26.185 "compare_and_write": false, 00:22:26.185 "abort": false, 00:22:26.185 "seek_hole": false, 00:22:26.185 "seek_data": false, 00:22:26.185 "copy": false, 00:22:26.185 "nvme_iov_md": false 00:22:26.185 }, 00:22:26.185 "memory_domains": [ 00:22:26.185 { 00:22:26.185 "dma_device_id": "system", 00:22:26.185 "dma_device_type": 1 00:22:26.185 }, 00:22:26.185 { 00:22:26.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.185 "dma_device_type": 2 00:22:26.185 }, 00:22:26.185 { 00:22:26.185 "dma_device_id": "system", 00:22:26.185 "dma_device_type": 1 00:22:26.185 }, 00:22:26.185 { 00:22:26.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.185 "dma_device_type": 2 00:22:26.186 }, 00:22:26.186 { 00:22:26.186 "dma_device_id": "system", 00:22:26.186 "dma_device_type": 1 00:22:26.186 }, 00:22:26.186 { 00:22:26.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.186 "dma_device_type": 2 00:22:26.186 }, 00:22:26.186 { 00:22:26.186 "dma_device_id": "system", 00:22:26.186 "dma_device_type": 1 00:22:26.186 }, 00:22:26.186 { 00:22:26.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.186 "dma_device_type": 2 00:22:26.186 } 00:22:26.186 ], 00:22:26.186 "driver_specific": { 00:22:26.186 "raid": { 00:22:26.186 "uuid": "c5e8a40b-28b1-472e-85b4-211bd777bb6a", 00:22:26.186 "strip_size_kb": 0, 00:22:26.186 "state": "online", 00:22:26.186 "raid_level": "raid1", 00:22:26.186 "superblock": false, 00:22:26.186 "num_base_bdevs": 4, 00:22:26.186 
"num_base_bdevs_discovered": 4, 00:22:26.186 "num_base_bdevs_operational": 4, 00:22:26.186 "base_bdevs_list": [ 00:22:26.186 { 00:22:26.186 "name": "NewBaseBdev", 00:22:26.186 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429", 00:22:26.186 "is_configured": true, 00:22:26.186 "data_offset": 0, 00:22:26.186 "data_size": 65536 00:22:26.186 }, 00:22:26.186 { 00:22:26.186 "name": "BaseBdev2", 00:22:26.186 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1", 00:22:26.186 "is_configured": true, 00:22:26.186 "data_offset": 0, 00:22:26.186 "data_size": 65536 00:22:26.186 }, 00:22:26.186 { 00:22:26.186 "name": "BaseBdev3", 00:22:26.186 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c", 00:22:26.186 "is_configured": true, 00:22:26.186 "data_offset": 0, 00:22:26.186 "data_size": 65536 00:22:26.186 }, 00:22:26.186 { 00:22:26.186 "name": "BaseBdev4", 00:22:26.186 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868", 00:22:26.186 "is_configured": true, 00:22:26.186 "data_offset": 0, 00:22:26.186 "data_size": 65536 00:22:26.186 } 00:22:26.186 ] 00:22:26.186 } 00:22:26.186 } 00:22:26.186 }' 00:22:26.186 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:26.186 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:26.186 BaseBdev2 00:22:26.186 BaseBdev3 00:22:26.186 BaseBdev4' 00:22:26.186 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:26.186 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:26.186 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:26.444 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:26.444 "name": "NewBaseBdev", 
00:22:26.444 "aliases": [ 00:22:26.444 "0dc2af7b-6abb-44a3-a701-0149509eb429" 00:22:26.444 ], 00:22:26.444 "product_name": "Malloc disk", 00:22:26.444 "block_size": 512, 00:22:26.444 "num_blocks": 65536, 00:22:26.444 "uuid": "0dc2af7b-6abb-44a3-a701-0149509eb429", 00:22:26.444 "assigned_rate_limits": { 00:22:26.444 "rw_ios_per_sec": 0, 00:22:26.444 "rw_mbytes_per_sec": 0, 00:22:26.444 "r_mbytes_per_sec": 0, 00:22:26.444 "w_mbytes_per_sec": 0 00:22:26.444 }, 00:22:26.444 "claimed": true, 00:22:26.444 "claim_type": "exclusive_write", 00:22:26.444 "zoned": false, 00:22:26.444 "supported_io_types": { 00:22:26.444 "read": true, 00:22:26.444 "write": true, 00:22:26.444 "unmap": true, 00:22:26.444 "flush": true, 00:22:26.444 "reset": true, 00:22:26.444 "nvme_admin": false, 00:22:26.444 "nvme_io": false, 00:22:26.444 "nvme_io_md": false, 00:22:26.444 "write_zeroes": true, 00:22:26.444 "zcopy": true, 00:22:26.444 "get_zone_info": false, 00:22:26.444 "zone_management": false, 00:22:26.444 "zone_append": false, 00:22:26.444 "compare": false, 00:22:26.444 "compare_and_write": false, 00:22:26.444 "abort": true, 00:22:26.444 "seek_hole": false, 00:22:26.444 "seek_data": false, 00:22:26.444 "copy": true, 00:22:26.444 "nvme_iov_md": false 00:22:26.444 }, 00:22:26.444 "memory_domains": [ 00:22:26.444 { 00:22:26.444 "dma_device_id": "system", 00:22:26.444 "dma_device_type": 1 00:22:26.444 }, 00:22:26.444 { 00:22:26.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.444 "dma_device_type": 2 00:22:26.444 } 00:22:26.444 ], 00:22:26.444 "driver_specific": {} 00:22:26.444 }' 00:22:26.444 19:58:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.444 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.702 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.960 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:26.960 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:26.960 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:26.960 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:27.218 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:27.218 "name": "BaseBdev2", 00:22:27.218 "aliases": [ 00:22:27.218 "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1" 00:22:27.218 ], 00:22:27.218 "product_name": "Malloc disk", 00:22:27.218 "block_size": 512, 00:22:27.218 "num_blocks": 65536, 00:22:27.218 "uuid": "2bdfc2fb-7b1b-48fe-90a1-d9ca9e779bc1", 00:22:27.218 "assigned_rate_limits": { 00:22:27.218 "rw_ios_per_sec": 0, 00:22:27.218 "rw_mbytes_per_sec": 0, 00:22:27.218 "r_mbytes_per_sec": 0, 00:22:27.218 "w_mbytes_per_sec": 0 00:22:27.218 }, 00:22:27.218 "claimed": true, 00:22:27.218 "claim_type": "exclusive_write", 00:22:27.218 "zoned": false, 00:22:27.218 "supported_io_types": { 00:22:27.218 
"read": true, 00:22:27.218 "write": true, 00:22:27.218 "unmap": true, 00:22:27.218 "flush": true, 00:22:27.218 "reset": true, 00:22:27.218 "nvme_admin": false, 00:22:27.218 "nvme_io": false, 00:22:27.219 "nvme_io_md": false, 00:22:27.219 "write_zeroes": true, 00:22:27.219 "zcopy": true, 00:22:27.219 "get_zone_info": false, 00:22:27.219 "zone_management": false, 00:22:27.219 "zone_append": false, 00:22:27.219 "compare": false, 00:22:27.219 "compare_and_write": false, 00:22:27.219 "abort": true, 00:22:27.219 "seek_hole": false, 00:22:27.219 "seek_data": false, 00:22:27.219 "copy": true, 00:22:27.219 "nvme_iov_md": false 00:22:27.219 }, 00:22:27.219 "memory_domains": [ 00:22:27.219 { 00:22:27.219 "dma_device_id": "system", 00:22:27.219 "dma_device_type": 1 00:22:27.219 }, 00:22:27.219 { 00:22:27.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:27.219 "dma_device_type": 2 00:22:27.219 } 00:22:27.219 ], 00:22:27.219 "driver_specific": {} 00:22:27.219 }' 00:22:27.219 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.219 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.219 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:27.219 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.219 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.219 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:27.219 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:27.477 19:58:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:27.735 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:27.735 "name": "BaseBdev3", 00:22:27.735 "aliases": [ 00:22:27.735 "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c" 00:22:27.735 ], 00:22:27.735 "product_name": "Malloc disk", 00:22:27.735 "block_size": 512, 00:22:27.735 "num_blocks": 65536, 00:22:27.735 "uuid": "7e73c08e-6a38-4ba5-ad73-ec3ff16cf46c", 00:22:27.735 "assigned_rate_limits": { 00:22:27.735 "rw_ios_per_sec": 0, 00:22:27.735 "rw_mbytes_per_sec": 0, 00:22:27.735 "r_mbytes_per_sec": 0, 00:22:27.735 "w_mbytes_per_sec": 0 00:22:27.735 }, 00:22:27.735 "claimed": true, 00:22:27.735 "claim_type": "exclusive_write", 00:22:27.735 "zoned": false, 00:22:27.735 "supported_io_types": { 00:22:27.735 "read": true, 00:22:27.735 "write": true, 00:22:27.735 "unmap": true, 00:22:27.735 "flush": true, 00:22:27.735 "reset": true, 00:22:27.735 "nvme_admin": false, 00:22:27.735 "nvme_io": false, 00:22:27.735 "nvme_io_md": false, 00:22:27.735 "write_zeroes": true, 00:22:27.735 "zcopy": true, 00:22:27.735 "get_zone_info": false, 00:22:27.735 "zone_management": false, 00:22:27.735 "zone_append": false, 00:22:27.735 "compare": false, 00:22:27.735 "compare_and_write": false, 00:22:27.735 "abort": true, 00:22:27.735 "seek_hole": false, 00:22:27.735 "seek_data": false, 00:22:27.735 "copy": true, 00:22:27.735 "nvme_iov_md": 
false 00:22:27.735 }, 00:22:27.735 "memory_domains": [ 00:22:27.735 { 00:22:27.735 "dma_device_id": "system", 00:22:27.735 "dma_device_type": 1 00:22:27.735 }, 00:22:27.735 { 00:22:27.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:27.735 "dma_device_type": 2 00:22:27.735 } 00:22:27.735 ], 00:22:27.735 "driver_specific": {} 00:22:27.735 }' 00:22:27.736 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.736 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.736 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:27.736 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:27.994 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:22:28.252 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:28.252 "name": "BaseBdev4", 00:22:28.252 "aliases": [ 00:22:28.252 "5080280b-c380-4621-9f9a-7e85ff705868" 00:22:28.252 ], 00:22:28.252 "product_name": "Malloc disk", 00:22:28.252 "block_size": 512, 00:22:28.252 "num_blocks": 65536, 00:22:28.252 "uuid": "5080280b-c380-4621-9f9a-7e85ff705868", 00:22:28.252 "assigned_rate_limits": { 00:22:28.252 "rw_ios_per_sec": 0, 00:22:28.252 "rw_mbytes_per_sec": 0, 00:22:28.252 "r_mbytes_per_sec": 0, 00:22:28.252 "w_mbytes_per_sec": 0 00:22:28.252 }, 00:22:28.252 "claimed": true, 00:22:28.252 "claim_type": "exclusive_write", 00:22:28.252 "zoned": false, 00:22:28.252 "supported_io_types": { 00:22:28.252 "read": true, 00:22:28.252 "write": true, 00:22:28.252 "unmap": true, 00:22:28.252 "flush": true, 00:22:28.252 "reset": true, 00:22:28.252 "nvme_admin": false, 00:22:28.252 "nvme_io": false, 00:22:28.252 "nvme_io_md": false, 00:22:28.252 "write_zeroes": true, 00:22:28.252 "zcopy": true, 00:22:28.252 "get_zone_info": false, 00:22:28.252 "zone_management": false, 00:22:28.252 "zone_append": false, 00:22:28.252 "compare": false, 00:22:28.252 "compare_and_write": false, 00:22:28.252 "abort": true, 00:22:28.252 "seek_hole": false, 00:22:28.252 "seek_data": false, 00:22:28.252 "copy": true, 00:22:28.252 "nvme_iov_md": false 00:22:28.252 }, 00:22:28.252 "memory_domains": [ 00:22:28.252 { 00:22:28.252 "dma_device_id": "system", 00:22:28.252 "dma_device_type": 1 00:22:28.252 }, 00:22:28.252 { 00:22:28.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:28.252 "dma_device_type": 2 00:22:28.252 } 00:22:28.252 ], 00:22:28.252 "driver_specific": {} 00:22:28.252 }' 00:22:28.252 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.510 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.510 19:58:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:28.510 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.510 19:58:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.510 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:28.510 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.510 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.768 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:28.768 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.768 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.768 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:28.768 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:29.027 [2024-07-24 19:58:20.449702] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:29.027 [2024-07-24 19:58:20.449729] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:29.027 [2024-07-24 19:58:20.449784] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:29.027 [2024-07-24 19:58:20.450068] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:29.027 [2024-07-24 19:58:20.450080] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xff4130 name Existed_Raid, state offline 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1467378 00:22:29.027 19:58:20 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 1467378 ']' 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 1467378 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1467378 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1467378' 00:22:29.027 killing process with pid 1467378 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 1467378 00:22:29.027 [2024-07-24 19:58:20.516358] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:29.027 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 1467378 00:22:29.027 [2024-07-24 19:58:20.558770] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:29.286 00:22:29.286 real 0m38.474s 00:22:29.286 user 1m11.005s 00:22:29.286 sys 0m6.459s 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:29.286 ************************************ 00:22:29.286 END TEST raid_state_function_test 00:22:29.286 ************************************ 00:22:29.286 19:58:20 bdev_raid -- 
bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:22:29.286 19:58:20 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:22:29.286 19:58:20 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:29.286 19:58:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:29.286 ************************************ 00:22:29.286 START TEST raid_state_function_test_sb 00:22:29.286 ************************************ 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 true 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:29.286 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.286 19:58:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1472949 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process 
raid pid: 1472949' 00:22:29.544 Process raid pid: 1472949 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1472949 /var/tmp/spdk-raid.sock 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1472949 ']' 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:29.544 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:29.544 19:58:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:29.544 [2024-07-24 19:58:20.945263] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:22:29.544 [2024-07-24 19:58:20.945334] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:29.544 [2024-07-24 19:58:21.075112] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:29.803 [2024-07-24 19:58:21.179358] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:29.803 [2024-07-24 19:58:21.236240] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:29.803 [2024-07-24 19:58:21.236287] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.368 19:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:30.368 19:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:22:30.368 19:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:30.626 [2024-07-24 19:58:22.105688] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:30.626 [2024-07-24 19:58:22.105724] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:30.626 [2024-07-24 19:58:22.105734] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:30.626 [2024-07-24 19:58:22.105746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:30.626 [2024-07-24 19:58:22.105754] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:30.626 [2024-07-24 19:58:22.105765] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:22:30.626 [2024-07-24 19:58:22.105778] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:30.626 [2024-07-24 19:58:22.105789] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:30.626 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:30.627 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.627 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:30.884 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.884 "name": "Existed_Raid", 00:22:30.884 "uuid": "abfe789d-25eb-4dff-8525-aa72b428ec18", 
00:22:30.884 "strip_size_kb": 0, 00:22:30.884 "state": "configuring", 00:22:30.884 "raid_level": "raid1", 00:22:30.884 "superblock": true, 00:22:30.884 "num_base_bdevs": 4, 00:22:30.884 "num_base_bdevs_discovered": 0, 00:22:30.884 "num_base_bdevs_operational": 4, 00:22:30.884 "base_bdevs_list": [ 00:22:30.884 { 00:22:30.884 "name": "BaseBdev1", 00:22:30.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.884 "is_configured": false, 00:22:30.884 "data_offset": 0, 00:22:30.884 "data_size": 0 00:22:30.884 }, 00:22:30.884 { 00:22:30.884 "name": "BaseBdev2", 00:22:30.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.884 "is_configured": false, 00:22:30.884 "data_offset": 0, 00:22:30.884 "data_size": 0 00:22:30.884 }, 00:22:30.884 { 00:22:30.884 "name": "BaseBdev3", 00:22:30.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.884 "is_configured": false, 00:22:30.884 "data_offset": 0, 00:22:30.885 "data_size": 0 00:22:30.885 }, 00:22:30.885 { 00:22:30.885 "name": "BaseBdev4", 00:22:30.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.885 "is_configured": false, 00:22:30.885 "data_offset": 0, 00:22:30.885 "data_size": 0 00:22:30.885 } 00:22:30.885 ] 00:22:30.885 }' 00:22:30.885 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.885 19:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:31.448 19:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:31.707 [2024-07-24 19:58:23.196446] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:31.707 [2024-07-24 19:58:23.196476] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16bea30 name Existed_Raid, state configuring 00:22:31.707 19:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:31.966 [2024-07-24 19:58:23.449132] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:31.966 [2024-07-24 19:58:23.449160] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:31.966 [2024-07-24 19:58:23.449169] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:31.967 [2024-07-24 19:58:23.449180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:31.967 [2024-07-24 19:58:23.449192] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:31.967 [2024-07-24 19:58:23.449203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:31.967 [2024-07-24 19:58:23.449212] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:31.967 [2024-07-24 19:58:23.449222] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:31.967 19:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:32.226 [2024-07-24 19:58:23.707737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:32.226 BaseBdev1 00:22:32.226 19:58:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:32.226 19:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:32.226 19:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:32.226 19:58:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:32.226 19:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:32.226 19:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:32.226 19:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:32.484 19:58:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:32.742 [ 00:22:32.742 { 00:22:32.742 "name": "BaseBdev1", 00:22:32.742 "aliases": [ 00:22:32.742 "cae63881-b227-44cf-aa5d-ad6954185cbf" 00:22:32.742 ], 00:22:32.742 "product_name": "Malloc disk", 00:22:32.742 "block_size": 512, 00:22:32.742 "num_blocks": 65536, 00:22:32.742 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:32.742 "assigned_rate_limits": { 00:22:32.742 "rw_ios_per_sec": 0, 00:22:32.742 "rw_mbytes_per_sec": 0, 00:22:32.742 "r_mbytes_per_sec": 0, 00:22:32.742 "w_mbytes_per_sec": 0 00:22:32.742 }, 00:22:32.742 "claimed": true, 00:22:32.742 "claim_type": "exclusive_write", 00:22:32.742 "zoned": false, 00:22:32.742 "supported_io_types": { 00:22:32.742 "read": true, 00:22:32.742 "write": true, 00:22:32.742 "unmap": true, 00:22:32.742 "flush": true, 00:22:32.742 "reset": true, 00:22:32.742 "nvme_admin": false, 00:22:32.742 "nvme_io": false, 00:22:32.742 "nvme_io_md": false, 00:22:32.742 "write_zeroes": true, 00:22:32.742 "zcopy": true, 00:22:32.742 "get_zone_info": false, 00:22:32.742 "zone_management": false, 00:22:32.742 "zone_append": false, 00:22:32.742 "compare": false, 00:22:32.742 "compare_and_write": false, 00:22:32.742 "abort": true, 00:22:32.742 "seek_hole": false, 00:22:32.742 "seek_data": false, 
00:22:32.742 "copy": true, 00:22:32.742 "nvme_iov_md": false 00:22:32.742 }, 00:22:32.742 "memory_domains": [ 00:22:32.742 { 00:22:32.742 "dma_device_id": "system", 00:22:32.742 "dma_device_type": 1 00:22:32.742 }, 00:22:32.742 { 00:22:32.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:32.742 "dma_device_type": 2 00:22:32.742 } 00:22:32.742 ], 00:22:32.742 "driver_specific": {} 00:22:32.742 } 00:22:32.742 ] 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.742 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:22:33.000 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.000 "name": "Existed_Raid", 00:22:33.000 "uuid": "f0fc8be8-569c-4dc8-b35a-3cea6240a4db", 00:22:33.001 "strip_size_kb": 0, 00:22:33.001 "state": "configuring", 00:22:33.001 "raid_level": "raid1", 00:22:33.001 "superblock": true, 00:22:33.001 "num_base_bdevs": 4, 00:22:33.001 "num_base_bdevs_discovered": 1, 00:22:33.001 "num_base_bdevs_operational": 4, 00:22:33.001 "base_bdevs_list": [ 00:22:33.001 { 00:22:33.001 "name": "BaseBdev1", 00:22:33.001 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:33.001 "is_configured": true, 00:22:33.001 "data_offset": 2048, 00:22:33.001 "data_size": 63488 00:22:33.001 }, 00:22:33.001 { 00:22:33.001 "name": "BaseBdev2", 00:22:33.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.001 "is_configured": false, 00:22:33.001 "data_offset": 0, 00:22:33.001 "data_size": 0 00:22:33.001 }, 00:22:33.001 { 00:22:33.001 "name": "BaseBdev3", 00:22:33.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.001 "is_configured": false, 00:22:33.001 "data_offset": 0, 00:22:33.001 "data_size": 0 00:22:33.001 }, 00:22:33.001 { 00:22:33.001 "name": "BaseBdev4", 00:22:33.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.001 "is_configured": false, 00:22:33.001 "data_offset": 0, 00:22:33.001 "data_size": 0 00:22:33.001 } 00:22:33.001 ] 00:22:33.001 }' 00:22:33.001 19:58:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.001 19:58:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:33.980 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:34.251 [2024-07-24 19:58:25.576681] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:22:34.251 [2024-07-24 19:58:25.576723] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16be2a0 name Existed_Raid, state configuring 00:22:34.251 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:34.251 [2024-07-24 19:58:25.821366] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:34.251 [2024-07-24 19:58:25.822816] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:34.251 [2024-07-24 19:58:25.822849] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:34.251 [2024-07-24 19:58:25.822860] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:34.251 [2024-07-24 19:58:25.822871] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:34.251 [2024-07-24 19:58:25.822880] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:34.251 [2024-07-24 19:58:25.822890] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:34.511 19:58:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.511 19:58:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.511 19:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.511 "name": "Existed_Raid", 00:22:34.511 "uuid": "2b953989-246f-4db5-8bdf-1003dd7d42cc", 00:22:34.511 "strip_size_kb": 0, 00:22:34.511 "state": "configuring", 00:22:34.511 "raid_level": "raid1", 00:22:34.511 "superblock": true, 00:22:34.511 "num_base_bdevs": 4, 00:22:34.511 "num_base_bdevs_discovered": 1, 00:22:34.511 "num_base_bdevs_operational": 4, 00:22:34.511 "base_bdevs_list": [ 00:22:34.511 { 00:22:34.511 "name": "BaseBdev1", 00:22:34.511 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:34.511 "is_configured": true, 00:22:34.511 "data_offset": 2048, 00:22:34.511 "data_size": 63488 00:22:34.511 }, 00:22:34.511 { 00:22:34.511 "name": "BaseBdev2", 00:22:34.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.511 "is_configured": false, 
00:22:34.511 "data_offset": 0, 00:22:34.511 "data_size": 0 00:22:34.511 }, 00:22:34.511 { 00:22:34.511 "name": "BaseBdev3", 00:22:34.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.511 "is_configured": false, 00:22:34.511 "data_offset": 0, 00:22:34.511 "data_size": 0 00:22:34.511 }, 00:22:34.511 { 00:22:34.511 "name": "BaseBdev4", 00:22:34.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.511 "is_configured": false, 00:22:34.511 "data_offset": 0, 00:22:34.511 "data_size": 0 00:22:34.511 } 00:22:34.511 ] 00:22:34.511 }' 00:22:34.511 19:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.511 19:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:35.449 [2024-07-24 19:58:26.943647] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:35.449 BaseBdev2 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:35.449 19:58:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:22:35.708 19:58:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:35.966 [ 00:22:35.966 { 00:22:35.966 "name": "BaseBdev2", 00:22:35.966 "aliases": [ 00:22:35.966 "3bde1df2-ac1f-4fdb-832e-623e6c603bb5" 00:22:35.966 ], 00:22:35.966 "product_name": "Malloc disk", 00:22:35.966 "block_size": 512, 00:22:35.966 "num_blocks": 65536, 00:22:35.966 "uuid": "3bde1df2-ac1f-4fdb-832e-623e6c603bb5", 00:22:35.966 "assigned_rate_limits": { 00:22:35.966 "rw_ios_per_sec": 0, 00:22:35.966 "rw_mbytes_per_sec": 0, 00:22:35.966 "r_mbytes_per_sec": 0, 00:22:35.966 "w_mbytes_per_sec": 0 00:22:35.966 }, 00:22:35.966 "claimed": true, 00:22:35.966 "claim_type": "exclusive_write", 00:22:35.966 "zoned": false, 00:22:35.966 "supported_io_types": { 00:22:35.966 "read": true, 00:22:35.966 "write": true, 00:22:35.966 "unmap": true, 00:22:35.966 "flush": true, 00:22:35.966 "reset": true, 00:22:35.966 "nvme_admin": false, 00:22:35.966 "nvme_io": false, 00:22:35.966 "nvme_io_md": false, 00:22:35.966 "write_zeroes": true, 00:22:35.966 "zcopy": true, 00:22:35.966 "get_zone_info": false, 00:22:35.966 "zone_management": false, 00:22:35.966 "zone_append": false, 00:22:35.967 "compare": false, 00:22:35.967 "compare_and_write": false, 00:22:35.967 "abort": true, 00:22:35.967 "seek_hole": false, 00:22:35.967 "seek_data": false, 00:22:35.967 "copy": true, 00:22:35.967 "nvme_iov_md": false 00:22:35.967 }, 00:22:35.967 "memory_domains": [ 00:22:35.967 { 00:22:35.967 "dma_device_id": "system", 00:22:35.967 "dma_device_type": 1 00:22:35.967 }, 00:22:35.967 { 00:22:35.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:35.967 "dma_device_type": 2 00:22:35.967 } 00:22:35.967 ], 00:22:35.967 "driver_specific": {} 00:22:35.967 } 00:22:35.967 ] 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 
-- # return 0 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.967 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:36.226 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.226 "name": "Existed_Raid", 00:22:36.226 "uuid": "2b953989-246f-4db5-8bdf-1003dd7d42cc", 00:22:36.226 "strip_size_kb": 0, 
00:22:36.226 "state": "configuring", 00:22:36.226 "raid_level": "raid1", 00:22:36.226 "superblock": true, 00:22:36.226 "num_base_bdevs": 4, 00:22:36.226 "num_base_bdevs_discovered": 2, 00:22:36.226 "num_base_bdevs_operational": 4, 00:22:36.226 "base_bdevs_list": [ 00:22:36.226 { 00:22:36.226 "name": "BaseBdev1", 00:22:36.226 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:36.226 "is_configured": true, 00:22:36.226 "data_offset": 2048, 00:22:36.226 "data_size": 63488 00:22:36.226 }, 00:22:36.226 { 00:22:36.226 "name": "BaseBdev2", 00:22:36.226 "uuid": "3bde1df2-ac1f-4fdb-832e-623e6c603bb5", 00:22:36.226 "is_configured": true, 00:22:36.226 "data_offset": 2048, 00:22:36.226 "data_size": 63488 00:22:36.226 }, 00:22:36.226 { 00:22:36.226 "name": "BaseBdev3", 00:22:36.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.226 "is_configured": false, 00:22:36.226 "data_offset": 0, 00:22:36.226 "data_size": 0 00:22:36.226 }, 00:22:36.226 { 00:22:36.226 "name": "BaseBdev4", 00:22:36.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.226 "is_configured": false, 00:22:36.226 "data_offset": 0, 00:22:36.226 "data_size": 0 00:22:36.226 } 00:22:36.226 ] 00:22:36.226 }' 00:22:36.226 19:58:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.226 19:58:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:36.794 19:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:37.053 [2024-07-24 19:58:28.563460] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:37.053 BaseBdev3 00:22:37.053 19:58:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:37.053 19:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local 
bdev_name=BaseBdev3 00:22:37.053 19:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:37.053 19:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:37.053 19:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:37.053 19:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:37.053 19:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:37.312 19:58:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:37.569 [ 00:22:37.569 { 00:22:37.569 "name": "BaseBdev3", 00:22:37.569 "aliases": [ 00:22:37.569 "35dd9637-da65-472f-9317-2e41c176da96" 00:22:37.569 ], 00:22:37.569 "product_name": "Malloc disk", 00:22:37.569 "block_size": 512, 00:22:37.569 "num_blocks": 65536, 00:22:37.569 "uuid": "35dd9637-da65-472f-9317-2e41c176da96", 00:22:37.569 "assigned_rate_limits": { 00:22:37.569 "rw_ios_per_sec": 0, 00:22:37.569 "rw_mbytes_per_sec": 0, 00:22:37.569 "r_mbytes_per_sec": 0, 00:22:37.569 "w_mbytes_per_sec": 0 00:22:37.569 }, 00:22:37.569 "claimed": true, 00:22:37.569 "claim_type": "exclusive_write", 00:22:37.569 "zoned": false, 00:22:37.569 "supported_io_types": { 00:22:37.569 "read": true, 00:22:37.569 "write": true, 00:22:37.569 "unmap": true, 00:22:37.569 "flush": true, 00:22:37.569 "reset": true, 00:22:37.569 "nvme_admin": false, 00:22:37.569 "nvme_io": false, 00:22:37.569 "nvme_io_md": false, 00:22:37.569 "write_zeroes": true, 00:22:37.569 "zcopy": true, 00:22:37.569 "get_zone_info": false, 00:22:37.569 "zone_management": false, 00:22:37.569 "zone_append": false, 00:22:37.569 
"compare": false, 00:22:37.569 "compare_and_write": false, 00:22:37.569 "abort": true, 00:22:37.569 "seek_hole": false, 00:22:37.569 "seek_data": false, 00:22:37.569 "copy": true, 00:22:37.569 "nvme_iov_md": false 00:22:37.569 }, 00:22:37.569 "memory_domains": [ 00:22:37.569 { 00:22:37.569 "dma_device_id": "system", 00:22:37.569 "dma_device_type": 1 00:22:37.569 }, 00:22:37.569 { 00:22:37.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.569 "dma_device_type": 2 00:22:37.569 } 00:22:37.569 ], 00:22:37.569 "driver_specific": {} 00:22:37.569 } 00:22:37.569 ] 00:22:37.569 19:58:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.570 19:58:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.570 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.827 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.827 "name": "Existed_Raid", 00:22:37.827 "uuid": "2b953989-246f-4db5-8bdf-1003dd7d42cc", 00:22:37.827 "strip_size_kb": 0, 00:22:37.827 "state": "configuring", 00:22:37.827 "raid_level": "raid1", 00:22:37.827 "superblock": true, 00:22:37.827 "num_base_bdevs": 4, 00:22:37.827 "num_base_bdevs_discovered": 3, 00:22:37.827 "num_base_bdevs_operational": 4, 00:22:37.827 "base_bdevs_list": [ 00:22:37.827 { 00:22:37.827 "name": "BaseBdev1", 00:22:37.827 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:37.827 "is_configured": true, 00:22:37.827 "data_offset": 2048, 00:22:37.827 "data_size": 63488 00:22:37.827 }, 00:22:37.827 { 00:22:37.827 "name": "BaseBdev2", 00:22:37.827 "uuid": "3bde1df2-ac1f-4fdb-832e-623e6c603bb5", 00:22:37.827 "is_configured": true, 00:22:37.827 "data_offset": 2048, 00:22:37.827 "data_size": 63488 00:22:37.827 }, 00:22:37.827 { 00:22:37.827 "name": "BaseBdev3", 00:22:37.827 "uuid": "35dd9637-da65-472f-9317-2e41c176da96", 00:22:37.827 "is_configured": true, 00:22:37.827 "data_offset": 2048, 00:22:37.827 "data_size": 63488 00:22:37.827 }, 00:22:37.827 { 00:22:37.827 "name": "BaseBdev4", 00:22:37.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.827 "is_configured": false, 00:22:37.827 "data_offset": 0, 00:22:37.827 "data_size": 0 00:22:37.827 } 00:22:37.827 ] 00:22:37.827 }' 00:22:37.827 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.827 19:58:29 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:38.393 19:58:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:38.653 [2024-07-24 19:58:30.090995] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:38.653 [2024-07-24 19:58:30.091178] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16bf300 00:22:38.653 [2024-07-24 19:58:30.091192] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:38.653 [2024-07-24 19:58:30.091377] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c0280 00:22:38.653 [2024-07-24 19:58:30.091527] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16bf300 00:22:38.653 [2024-07-24 19:58:30.091538] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16bf300 00:22:38.653 [2024-07-24 19:58:30.091638] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:38.653 BaseBdev4 00:22:38.653 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:38.653 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:38.653 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:38.653 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:38.653 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:38.653 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:38.653 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:38.911 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:39.171 [ 00:22:39.171 { 00:22:39.171 "name": "BaseBdev4", 00:22:39.171 "aliases": [ 00:22:39.171 "4e9e6af5-5f54-40b1-a481-5879141c4ce7" 00:22:39.171 ], 00:22:39.171 "product_name": "Malloc disk", 00:22:39.171 "block_size": 512, 00:22:39.171 "num_blocks": 65536, 00:22:39.171 "uuid": "4e9e6af5-5f54-40b1-a481-5879141c4ce7", 00:22:39.171 "assigned_rate_limits": { 00:22:39.171 "rw_ios_per_sec": 0, 00:22:39.171 "rw_mbytes_per_sec": 0, 00:22:39.171 "r_mbytes_per_sec": 0, 00:22:39.171 "w_mbytes_per_sec": 0 00:22:39.171 }, 00:22:39.171 "claimed": true, 00:22:39.171 "claim_type": "exclusive_write", 00:22:39.171 "zoned": false, 00:22:39.171 "supported_io_types": { 00:22:39.171 "read": true, 00:22:39.171 "write": true, 00:22:39.171 "unmap": true, 00:22:39.171 "flush": true, 00:22:39.171 "reset": true, 00:22:39.171 "nvme_admin": false, 00:22:39.171 "nvme_io": false, 00:22:39.171 "nvme_io_md": false, 00:22:39.171 "write_zeroes": true, 00:22:39.171 "zcopy": true, 00:22:39.171 "get_zone_info": false, 00:22:39.171 "zone_management": false, 00:22:39.171 "zone_append": false, 00:22:39.171 "compare": false, 00:22:39.171 "compare_and_write": false, 00:22:39.171 "abort": true, 00:22:39.171 "seek_hole": false, 00:22:39.171 "seek_data": false, 00:22:39.171 "copy": true, 00:22:39.171 "nvme_iov_md": false 00:22:39.171 }, 00:22:39.171 "memory_domains": [ 00:22:39.171 { 00:22:39.171 "dma_device_id": "system", 00:22:39.171 "dma_device_type": 1 00:22:39.171 }, 00:22:39.171 { 00:22:39.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:39.171 "dma_device_type": 2 00:22:39.171 } 00:22:39.171 ], 00:22:39.171 "driver_specific": {} 00:22:39.171 } 00:22:39.171 ] 
00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.171 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:39.430 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:39.430 "name": "Existed_Raid", 00:22:39.430 
"uuid": "2b953989-246f-4db5-8bdf-1003dd7d42cc", 00:22:39.430 "strip_size_kb": 0, 00:22:39.430 "state": "online", 00:22:39.430 "raid_level": "raid1", 00:22:39.430 "superblock": true, 00:22:39.430 "num_base_bdevs": 4, 00:22:39.430 "num_base_bdevs_discovered": 4, 00:22:39.430 "num_base_bdevs_operational": 4, 00:22:39.430 "base_bdevs_list": [ 00:22:39.430 { 00:22:39.430 "name": "BaseBdev1", 00:22:39.430 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:39.430 "is_configured": true, 00:22:39.430 "data_offset": 2048, 00:22:39.430 "data_size": 63488 00:22:39.430 }, 00:22:39.430 { 00:22:39.430 "name": "BaseBdev2", 00:22:39.430 "uuid": "3bde1df2-ac1f-4fdb-832e-623e6c603bb5", 00:22:39.430 "is_configured": true, 00:22:39.430 "data_offset": 2048, 00:22:39.430 "data_size": 63488 00:22:39.430 }, 00:22:39.430 { 00:22:39.430 "name": "BaseBdev3", 00:22:39.430 "uuid": "35dd9637-da65-472f-9317-2e41c176da96", 00:22:39.430 "is_configured": true, 00:22:39.430 "data_offset": 2048, 00:22:39.430 "data_size": 63488 00:22:39.430 }, 00:22:39.430 { 00:22:39.430 "name": "BaseBdev4", 00:22:39.430 "uuid": "4e9e6af5-5f54-40b1-a481-5879141c4ce7", 00:22:39.430 "is_configured": true, 00:22:39.430 "data_offset": 2048, 00:22:39.430 "data_size": 63488 00:22:39.430 } 00:22:39.430 ] 00:22:39.430 }' 00:22:39.430 19:58:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:39.430 19:58:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:39.997 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:39.997 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:39.997 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:39.997 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:39.997 19:58:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:39.997 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:39.997 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:39.997 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:40.255 [2024-07-24 19:58:31.687564] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:40.255 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:40.255 "name": "Existed_Raid", 00:22:40.255 "aliases": [ 00:22:40.255 "2b953989-246f-4db5-8bdf-1003dd7d42cc" 00:22:40.255 ], 00:22:40.255 "product_name": "Raid Volume", 00:22:40.255 "block_size": 512, 00:22:40.255 "num_blocks": 63488, 00:22:40.255 "uuid": "2b953989-246f-4db5-8bdf-1003dd7d42cc", 00:22:40.255 "assigned_rate_limits": { 00:22:40.255 "rw_ios_per_sec": 0, 00:22:40.255 "rw_mbytes_per_sec": 0, 00:22:40.255 "r_mbytes_per_sec": 0, 00:22:40.255 "w_mbytes_per_sec": 0 00:22:40.255 }, 00:22:40.255 "claimed": false, 00:22:40.255 "zoned": false, 00:22:40.255 "supported_io_types": { 00:22:40.255 "read": true, 00:22:40.255 "write": true, 00:22:40.255 "unmap": false, 00:22:40.255 "flush": false, 00:22:40.255 "reset": true, 00:22:40.255 "nvme_admin": false, 00:22:40.255 "nvme_io": false, 00:22:40.255 "nvme_io_md": false, 00:22:40.255 "write_zeroes": true, 00:22:40.255 "zcopy": false, 00:22:40.255 "get_zone_info": false, 00:22:40.255 "zone_management": false, 00:22:40.255 "zone_append": false, 00:22:40.255 "compare": false, 00:22:40.255 "compare_and_write": false, 00:22:40.255 "abort": false, 00:22:40.255 "seek_hole": false, 00:22:40.256 "seek_data": false, 00:22:40.256 "copy": false, 00:22:40.256 "nvme_iov_md": false 00:22:40.256 }, 00:22:40.256 
"memory_domains": [ 00:22:40.256 { 00:22:40.256 "dma_device_id": "system", 00:22:40.256 "dma_device_type": 1 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.256 "dma_device_type": 2 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "dma_device_id": "system", 00:22:40.256 "dma_device_type": 1 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.256 "dma_device_type": 2 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "dma_device_id": "system", 00:22:40.256 "dma_device_type": 1 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.256 "dma_device_type": 2 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "dma_device_id": "system", 00:22:40.256 "dma_device_type": 1 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.256 "dma_device_type": 2 00:22:40.256 } 00:22:40.256 ], 00:22:40.256 "driver_specific": { 00:22:40.256 "raid": { 00:22:40.256 "uuid": "2b953989-246f-4db5-8bdf-1003dd7d42cc", 00:22:40.256 "strip_size_kb": 0, 00:22:40.256 "state": "online", 00:22:40.256 "raid_level": "raid1", 00:22:40.256 "superblock": true, 00:22:40.256 "num_base_bdevs": 4, 00:22:40.256 "num_base_bdevs_discovered": 4, 00:22:40.256 "num_base_bdevs_operational": 4, 00:22:40.256 "base_bdevs_list": [ 00:22:40.256 { 00:22:40.256 "name": "BaseBdev1", 00:22:40.256 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:40.256 "is_configured": true, 00:22:40.256 "data_offset": 2048, 00:22:40.256 "data_size": 63488 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "name": "BaseBdev2", 00:22:40.256 "uuid": "3bde1df2-ac1f-4fdb-832e-623e6c603bb5", 00:22:40.256 "is_configured": true, 00:22:40.256 "data_offset": 2048, 00:22:40.256 "data_size": 63488 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "name": "BaseBdev3", 00:22:40.256 "uuid": "35dd9637-da65-472f-9317-2e41c176da96", 00:22:40.256 "is_configured": true, 00:22:40.256 "data_offset": 2048, 00:22:40.256 
"data_size": 63488 00:22:40.256 }, 00:22:40.256 { 00:22:40.256 "name": "BaseBdev4", 00:22:40.256 "uuid": "4e9e6af5-5f54-40b1-a481-5879141c4ce7", 00:22:40.256 "is_configured": true, 00:22:40.256 "data_offset": 2048, 00:22:40.256 "data_size": 63488 00:22:40.256 } 00:22:40.256 ] 00:22:40.256 } 00:22:40.256 } 00:22:40.256 }' 00:22:40.256 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:40.256 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:40.256 BaseBdev2 00:22:40.256 BaseBdev3 00:22:40.256 BaseBdev4' 00:22:40.256 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:40.256 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:40.256 19:58:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:40.514 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:40.514 "name": "BaseBdev1", 00:22:40.514 "aliases": [ 00:22:40.514 "cae63881-b227-44cf-aa5d-ad6954185cbf" 00:22:40.514 ], 00:22:40.514 "product_name": "Malloc disk", 00:22:40.514 "block_size": 512, 00:22:40.514 "num_blocks": 65536, 00:22:40.514 "uuid": "cae63881-b227-44cf-aa5d-ad6954185cbf", 00:22:40.514 "assigned_rate_limits": { 00:22:40.514 "rw_ios_per_sec": 0, 00:22:40.514 "rw_mbytes_per_sec": 0, 00:22:40.514 "r_mbytes_per_sec": 0, 00:22:40.514 "w_mbytes_per_sec": 0 00:22:40.514 }, 00:22:40.514 "claimed": true, 00:22:40.514 "claim_type": "exclusive_write", 00:22:40.514 "zoned": false, 00:22:40.514 "supported_io_types": { 00:22:40.514 "read": true, 00:22:40.514 "write": true, 00:22:40.514 "unmap": true, 00:22:40.514 "flush": true, 00:22:40.514 "reset": true, 
00:22:40.514 "nvme_admin": false, 00:22:40.514 "nvme_io": false, 00:22:40.514 "nvme_io_md": false, 00:22:40.514 "write_zeroes": true, 00:22:40.514 "zcopy": true, 00:22:40.514 "get_zone_info": false, 00:22:40.514 "zone_management": false, 00:22:40.514 "zone_append": false, 00:22:40.514 "compare": false, 00:22:40.514 "compare_and_write": false, 00:22:40.514 "abort": true, 00:22:40.514 "seek_hole": false, 00:22:40.514 "seek_data": false, 00:22:40.514 "copy": true, 00:22:40.514 "nvme_iov_md": false 00:22:40.514 }, 00:22:40.514 "memory_domains": [ 00:22:40.514 { 00:22:40.514 "dma_device_id": "system", 00:22:40.514 "dma_device_type": 1 00:22:40.514 }, 00:22:40.514 { 00:22:40.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.514 "dma_device_type": 2 00:22:40.514 } 00:22:40.514 ], 00:22:40.514 "driver_specific": {} 00:22:40.514 }' 00:22:40.514 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:40.514 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:40.514 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:40.514 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:40.772 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:40.772 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:40.772 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:40.772 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:40.772 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:40.772 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:40.773 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:41.031 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:41.031 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.031 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:41.031 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.289 "name": "BaseBdev2", 00:22:41.289 "aliases": [ 00:22:41.289 "3bde1df2-ac1f-4fdb-832e-623e6c603bb5" 00:22:41.289 ], 00:22:41.289 "product_name": "Malloc disk", 00:22:41.289 "block_size": 512, 00:22:41.289 "num_blocks": 65536, 00:22:41.289 "uuid": "3bde1df2-ac1f-4fdb-832e-623e6c603bb5", 00:22:41.289 "assigned_rate_limits": { 00:22:41.289 "rw_ios_per_sec": 0, 00:22:41.289 "rw_mbytes_per_sec": 0, 00:22:41.289 "r_mbytes_per_sec": 0, 00:22:41.289 "w_mbytes_per_sec": 0 00:22:41.289 }, 00:22:41.289 "claimed": true, 00:22:41.289 "claim_type": "exclusive_write", 00:22:41.289 "zoned": false, 00:22:41.289 "supported_io_types": { 00:22:41.289 "read": true, 00:22:41.289 "write": true, 00:22:41.289 "unmap": true, 00:22:41.289 "flush": true, 00:22:41.289 "reset": true, 00:22:41.289 "nvme_admin": false, 00:22:41.289 "nvme_io": false, 00:22:41.289 "nvme_io_md": false, 00:22:41.289 "write_zeroes": true, 00:22:41.289 "zcopy": true, 00:22:41.289 "get_zone_info": false, 00:22:41.289 "zone_management": false, 00:22:41.289 "zone_append": false, 00:22:41.289 "compare": false, 00:22:41.289 "compare_and_write": false, 00:22:41.289 "abort": true, 00:22:41.289 "seek_hole": false, 00:22:41.289 "seek_data": false, 00:22:41.289 "copy": true, 00:22:41.289 "nvme_iov_md": false 00:22:41.289 }, 00:22:41.289 "memory_domains": [ 00:22:41.289 { 
00:22:41.289 "dma_device_id": "system", 00:22:41.289 "dma_device_type": 1 00:22:41.289 }, 00:22:41.289 { 00:22:41.289 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.289 "dma_device_type": 2 00:22:41.289 } 00:22:41.289 ], 00:22:41.289 "driver_specific": {} 00:22:41.289 }' 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:41.289 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.548 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.548 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:41.548 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.548 19:58:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.548 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:41.548 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.548 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:41.548 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:41.807 19:58:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.807 "name": "BaseBdev3", 00:22:41.807 "aliases": [ 00:22:41.807 "35dd9637-da65-472f-9317-2e41c176da96" 00:22:41.807 ], 00:22:41.807 "product_name": "Malloc disk", 00:22:41.807 "block_size": 512, 00:22:41.807 "num_blocks": 65536, 00:22:41.807 "uuid": "35dd9637-da65-472f-9317-2e41c176da96", 00:22:41.807 "assigned_rate_limits": { 00:22:41.807 "rw_ios_per_sec": 0, 00:22:41.807 "rw_mbytes_per_sec": 0, 00:22:41.807 "r_mbytes_per_sec": 0, 00:22:41.807 "w_mbytes_per_sec": 0 00:22:41.807 }, 00:22:41.807 "claimed": true, 00:22:41.807 "claim_type": "exclusive_write", 00:22:41.807 "zoned": false, 00:22:41.807 "supported_io_types": { 00:22:41.807 "read": true, 00:22:41.807 "write": true, 00:22:41.807 "unmap": true, 00:22:41.807 "flush": true, 00:22:41.807 "reset": true, 00:22:41.807 "nvme_admin": false, 00:22:41.807 "nvme_io": false, 00:22:41.807 "nvme_io_md": false, 00:22:41.807 "write_zeroes": true, 00:22:41.807 "zcopy": true, 00:22:41.807 "get_zone_info": false, 00:22:41.807 "zone_management": false, 00:22:41.807 "zone_append": false, 00:22:41.807 "compare": false, 00:22:41.807 "compare_and_write": false, 00:22:41.807 "abort": true, 00:22:41.807 "seek_hole": false, 00:22:41.807 "seek_data": false, 00:22:41.807 "copy": true, 00:22:41.807 "nvme_iov_md": false 00:22:41.807 }, 00:22:41.807 "memory_domains": [ 00:22:41.807 { 00:22:41.807 "dma_device_id": "system", 00:22:41.807 "dma_device_type": 1 00:22:41.807 }, 00:22:41.807 { 00:22:41.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.807 "dma_device_type": 2 00:22:41.807 } 00:22:41.807 ], 00:22:41.807 "driver_specific": {} 00:22:41.807 }' 00:22:41.807 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.807 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.807 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:22:41.807 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:42.065 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:42.323 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:42.323 "name": "BaseBdev4", 00:22:42.324 "aliases": [ 00:22:42.324 "4e9e6af5-5f54-40b1-a481-5879141c4ce7" 00:22:42.324 ], 00:22:42.324 "product_name": "Malloc disk", 00:22:42.324 "block_size": 512, 00:22:42.324 "num_blocks": 65536, 00:22:42.324 "uuid": "4e9e6af5-5f54-40b1-a481-5879141c4ce7", 00:22:42.324 "assigned_rate_limits": { 00:22:42.324 "rw_ios_per_sec": 0, 00:22:42.324 "rw_mbytes_per_sec": 0, 00:22:42.324 "r_mbytes_per_sec": 0, 00:22:42.324 "w_mbytes_per_sec": 0 
00:22:42.324 }, 00:22:42.324 "claimed": true, 00:22:42.324 "claim_type": "exclusive_write", 00:22:42.324 "zoned": false, 00:22:42.324 "supported_io_types": { 00:22:42.324 "read": true, 00:22:42.324 "write": true, 00:22:42.324 "unmap": true, 00:22:42.324 "flush": true, 00:22:42.324 "reset": true, 00:22:42.324 "nvme_admin": false, 00:22:42.324 "nvme_io": false, 00:22:42.324 "nvme_io_md": false, 00:22:42.324 "write_zeroes": true, 00:22:42.324 "zcopy": true, 00:22:42.324 "get_zone_info": false, 00:22:42.324 "zone_management": false, 00:22:42.324 "zone_append": false, 00:22:42.324 "compare": false, 00:22:42.324 "compare_and_write": false, 00:22:42.324 "abort": true, 00:22:42.324 "seek_hole": false, 00:22:42.324 "seek_data": false, 00:22:42.324 "copy": true, 00:22:42.324 "nvme_iov_md": false 00:22:42.324 }, 00:22:42.324 "memory_domains": [ 00:22:42.324 { 00:22:42.324 "dma_device_id": "system", 00:22:42.324 "dma_device_type": 1 00:22:42.324 }, 00:22:42.324 { 00:22:42.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.324 "dma_device_type": 2 00:22:42.324 } 00:22:42.324 ], 00:22:42.324 "driver_specific": {} 00:22:42.324 }' 00:22:42.324 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.324 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.582 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:42.582 19:58:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.582 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.582 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:42.582 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.582 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.582 
19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:42.582 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.842 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.842 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:42.842 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:43.100 [2024-07-24 19:58:34.446608] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.100 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:43.358 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:43.358 "name": "Existed_Raid", 00:22:43.358 "uuid": "2b953989-246f-4db5-8bdf-1003dd7d42cc", 00:22:43.358 "strip_size_kb": 0, 00:22:43.358 "state": "online", 00:22:43.358 "raid_level": "raid1", 00:22:43.358 "superblock": true, 00:22:43.358 "num_base_bdevs": 4, 00:22:43.358 "num_base_bdevs_discovered": 3, 00:22:43.358 "num_base_bdevs_operational": 3, 00:22:43.358 "base_bdevs_list": [ 00:22:43.358 { 00:22:43.358 "name": null, 00:22:43.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.358 "is_configured": false, 00:22:43.358 "data_offset": 2048, 00:22:43.358 "data_size": 63488 00:22:43.358 }, 00:22:43.358 { 00:22:43.358 "name": "BaseBdev2", 00:22:43.358 "uuid": "3bde1df2-ac1f-4fdb-832e-623e6c603bb5", 00:22:43.358 "is_configured": true, 00:22:43.358 "data_offset": 2048, 00:22:43.358 "data_size": 63488 00:22:43.358 }, 00:22:43.358 { 00:22:43.358 "name": "BaseBdev3", 00:22:43.358 "uuid": "35dd9637-da65-472f-9317-2e41c176da96", 00:22:43.358 "is_configured": true, 00:22:43.358 "data_offset": 2048, 00:22:43.358 "data_size": 63488 00:22:43.358 }, 00:22:43.358 { 00:22:43.358 "name": 
"BaseBdev4", 00:22:43.358 "uuid": "4e9e6af5-5f54-40b1-a481-5879141c4ce7", 00:22:43.358 "is_configured": true, 00:22:43.358 "data_offset": 2048, 00:22:43.358 "data_size": 63488 00:22:43.358 } 00:22:43.358 ] 00:22:43.358 }' 00:22:43.358 19:58:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:43.358 19:58:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:43.922 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:43.922 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:43.922 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.922 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:44.181 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:44.181 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:44.181 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:44.438 [2024-07-24 19:58:35.855588] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:44.438 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:44.438 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:44.438 19:58:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.438 19:58:35 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:44.696 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:44.696 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:44.697 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:44.956 [2024-07-24 19:58:36.361434] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:44.956 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:44.956 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:44.956 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.956 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:45.215 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:45.215 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:45.216 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:45.475 [2024-07-24 19:58:36.877341] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:45.475 [2024-07-24 19:58:36.877440] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:45.475 [2024-07-24 19:58:36.890011] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:45.475 [2024-07-24 19:58:36.890056] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:45.475 [2024-07-24 19:58:36.890069] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16bf300 name Existed_Raid, state offline 00:22:45.475 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:45.475 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:45.475 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.475 19:58:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:45.733 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:45.733 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:45.733 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:45.733 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:45.734 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:45.734 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:45.992 BaseBdev2 00:22:45.992 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:45.992 19:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:45.992 19:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:45.992 19:58:37 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # local i 00:22:45.992 19:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:45.992 19:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:45.992 19:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:46.252 19:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:46.511 [ 00:22:46.511 { 00:22:46.511 "name": "BaseBdev2", 00:22:46.511 "aliases": [ 00:22:46.511 "aa3a78f6-1032-448c-9198-b23cec18b3f3" 00:22:46.511 ], 00:22:46.511 "product_name": "Malloc disk", 00:22:46.511 "block_size": 512, 00:22:46.511 "num_blocks": 65536, 00:22:46.511 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:46.511 "assigned_rate_limits": { 00:22:46.511 "rw_ios_per_sec": 0, 00:22:46.511 "rw_mbytes_per_sec": 0, 00:22:46.511 "r_mbytes_per_sec": 0, 00:22:46.511 "w_mbytes_per_sec": 0 00:22:46.511 }, 00:22:46.511 "claimed": false, 00:22:46.511 "zoned": false, 00:22:46.511 "supported_io_types": { 00:22:46.511 "read": true, 00:22:46.511 "write": true, 00:22:46.511 "unmap": true, 00:22:46.511 "flush": true, 00:22:46.511 "reset": true, 00:22:46.511 "nvme_admin": false, 00:22:46.511 "nvme_io": false, 00:22:46.511 "nvme_io_md": false, 00:22:46.511 "write_zeroes": true, 00:22:46.511 "zcopy": true, 00:22:46.511 "get_zone_info": false, 00:22:46.511 "zone_management": false, 00:22:46.511 "zone_append": false, 00:22:46.511 "compare": false, 00:22:46.511 "compare_and_write": false, 00:22:46.511 "abort": true, 00:22:46.511 "seek_hole": false, 00:22:46.511 "seek_data": false, 00:22:46.511 "copy": true, 00:22:46.511 "nvme_iov_md": false 00:22:46.511 }, 00:22:46.511 
"memory_domains": [ 00:22:46.511 { 00:22:46.511 "dma_device_id": "system", 00:22:46.511 "dma_device_type": 1 00:22:46.511 }, 00:22:46.511 { 00:22:46.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.511 "dma_device_type": 2 00:22:46.511 } 00:22:46.511 ], 00:22:46.511 "driver_specific": {} 00:22:46.511 } 00:22:46.511 ] 00:22:46.511 19:58:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:46.511 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:46.511 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:46.511 19:58:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:46.770 BaseBdev3 00:22:46.770 19:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:46.770 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:22:46.770 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:46.770 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:46.770 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:46.770 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:46.770 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:47.029 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:22:47.288 [ 00:22:47.288 { 00:22:47.288 "name": "BaseBdev3", 00:22:47.288 "aliases": [ 00:22:47.288 "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8" 00:22:47.288 ], 00:22:47.288 "product_name": "Malloc disk", 00:22:47.288 "block_size": 512, 00:22:47.288 "num_blocks": 65536, 00:22:47.288 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:47.288 "assigned_rate_limits": { 00:22:47.288 "rw_ios_per_sec": 0, 00:22:47.288 "rw_mbytes_per_sec": 0, 00:22:47.288 "r_mbytes_per_sec": 0, 00:22:47.288 "w_mbytes_per_sec": 0 00:22:47.288 }, 00:22:47.288 "claimed": false, 00:22:47.288 "zoned": false, 00:22:47.288 "supported_io_types": { 00:22:47.288 "read": true, 00:22:47.288 "write": true, 00:22:47.288 "unmap": true, 00:22:47.288 "flush": true, 00:22:47.288 "reset": true, 00:22:47.288 "nvme_admin": false, 00:22:47.288 "nvme_io": false, 00:22:47.288 "nvme_io_md": false, 00:22:47.288 "write_zeroes": true, 00:22:47.288 "zcopy": true, 00:22:47.288 "get_zone_info": false, 00:22:47.288 "zone_management": false, 00:22:47.288 "zone_append": false, 00:22:47.288 "compare": false, 00:22:47.288 "compare_and_write": false, 00:22:47.288 "abort": true, 00:22:47.288 "seek_hole": false, 00:22:47.288 "seek_data": false, 00:22:47.288 "copy": true, 00:22:47.288 "nvme_iov_md": false 00:22:47.288 }, 00:22:47.288 "memory_domains": [ 00:22:47.288 { 00:22:47.288 "dma_device_id": "system", 00:22:47.288 "dma_device_type": 1 00:22:47.288 }, 00:22:47.288 { 00:22:47.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.288 "dma_device_type": 2 00:22:47.288 } 00:22:47.288 ], 00:22:47.288 "driver_specific": {} 00:22:47.288 } 00:22:47.288 ] 00:22:47.288 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:47.288 19:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:47.288 19:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:47.288 19:58:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:47.288 BaseBdev4 00:22:47.547 19:58:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:47.547 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:22:47.547 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:47.547 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:47.547 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:47.547 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:47.547 19:58:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:47.547 19:58:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:47.806 [ 00:22:47.806 { 00:22:47.806 "name": "BaseBdev4", 00:22:47.806 "aliases": [ 00:22:47.806 "fc54eec3-0e01-4dc7-a7fd-7c073748fdee" 00:22:47.806 ], 00:22:47.806 "product_name": "Malloc disk", 00:22:47.806 "block_size": 512, 00:22:47.806 "num_blocks": 65536, 00:22:47.806 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:47.806 "assigned_rate_limits": { 00:22:47.806 "rw_ios_per_sec": 0, 00:22:47.806 "rw_mbytes_per_sec": 0, 00:22:47.806 "r_mbytes_per_sec": 0, 00:22:47.806 "w_mbytes_per_sec": 0 00:22:47.806 }, 00:22:47.806 "claimed": false, 00:22:47.806 "zoned": false, 00:22:47.806 "supported_io_types": { 00:22:47.806 "read": true, 
00:22:47.806 "write": true, 00:22:47.806 "unmap": true, 00:22:47.806 "flush": true, 00:22:47.806 "reset": true, 00:22:47.806 "nvme_admin": false, 00:22:47.806 "nvme_io": false, 00:22:47.806 "nvme_io_md": false, 00:22:47.806 "write_zeroes": true, 00:22:47.806 "zcopy": true, 00:22:47.806 "get_zone_info": false, 00:22:47.806 "zone_management": false, 00:22:47.806 "zone_append": false, 00:22:47.806 "compare": false, 00:22:47.806 "compare_and_write": false, 00:22:47.806 "abort": true, 00:22:47.806 "seek_hole": false, 00:22:47.806 "seek_data": false, 00:22:47.806 "copy": true, 00:22:47.806 "nvme_iov_md": false 00:22:47.806 }, 00:22:47.806 "memory_domains": [ 00:22:47.806 { 00:22:47.806 "dma_device_id": "system", 00:22:47.806 "dma_device_type": 1 00:22:47.806 }, 00:22:47.806 { 00:22:47.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.806 "dma_device_type": 2 00:22:47.806 } 00:22:47.806 ], 00:22:47.806 "driver_specific": {} 00:22:47.806 } 00:22:47.806 ] 00:22:47.806 19:58:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:47.806 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:47.806 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:47.806 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:48.067 [2024-07-24 19:58:39.591439] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:48.067 [2024-07-24 19:58:39.591485] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:48.067 [2024-07-24 19:58:39.591505] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:48.067 [2024-07-24 19:58:39.592951] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:48.067 [2024-07-24 19:58:39.592995] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.067 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:48.326 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:48.326 "name": "Existed_Raid", 00:22:48.326 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:22:48.326 "strip_size_kb": 0, 00:22:48.326 "state": 
"configuring", 00:22:48.326 "raid_level": "raid1", 00:22:48.326 "superblock": true, 00:22:48.326 "num_base_bdevs": 4, 00:22:48.326 "num_base_bdevs_discovered": 3, 00:22:48.326 "num_base_bdevs_operational": 4, 00:22:48.326 "base_bdevs_list": [ 00:22:48.326 { 00:22:48.326 "name": "BaseBdev1", 00:22:48.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.326 "is_configured": false, 00:22:48.326 "data_offset": 0, 00:22:48.326 "data_size": 0 00:22:48.326 }, 00:22:48.326 { 00:22:48.326 "name": "BaseBdev2", 00:22:48.326 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:48.326 "is_configured": true, 00:22:48.326 "data_offset": 2048, 00:22:48.326 "data_size": 63488 00:22:48.326 }, 00:22:48.326 { 00:22:48.326 "name": "BaseBdev3", 00:22:48.326 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:48.326 "is_configured": true, 00:22:48.326 "data_offset": 2048, 00:22:48.326 "data_size": 63488 00:22:48.326 }, 00:22:48.326 { 00:22:48.326 "name": "BaseBdev4", 00:22:48.327 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:48.327 "is_configured": true, 00:22:48.327 "data_offset": 2048, 00:22:48.327 "data_size": 63488 00:22:48.327 } 00:22:48.327 ] 00:22:48.327 }' 00:22:48.327 19:58:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:48.327 19:58:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:49.264 [2024-07-24 19:58:40.746467] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:49.264 
19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.264 19:58:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:49.523 19:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.523 "name": "Existed_Raid", 00:22:49.523 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:22:49.523 "strip_size_kb": 0, 00:22:49.523 "state": "configuring", 00:22:49.523 "raid_level": "raid1", 00:22:49.523 "superblock": true, 00:22:49.523 "num_base_bdevs": 4, 00:22:49.523 "num_base_bdevs_discovered": 2, 00:22:49.523 "num_base_bdevs_operational": 4, 00:22:49.523 "base_bdevs_list": [ 00:22:49.523 { 00:22:49.523 "name": "BaseBdev1", 00:22:49.523 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.523 "is_configured": false, 00:22:49.523 "data_offset": 0, 00:22:49.523 "data_size": 0 00:22:49.523 }, 00:22:49.523 { 00:22:49.523 
"name": null, 00:22:49.523 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:49.523 "is_configured": false, 00:22:49.523 "data_offset": 2048, 00:22:49.523 "data_size": 63488 00:22:49.523 }, 00:22:49.523 { 00:22:49.523 "name": "BaseBdev3", 00:22:49.523 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:49.523 "is_configured": true, 00:22:49.523 "data_offset": 2048, 00:22:49.523 "data_size": 63488 00:22:49.523 }, 00:22:49.523 { 00:22:49.523 "name": "BaseBdev4", 00:22:49.523 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:49.523 "is_configured": true, 00:22:49.523 "data_offset": 2048, 00:22:49.523 "data_size": 63488 00:22:49.523 } 00:22:49.523 ] 00:22:49.523 }' 00:22:49.523 19:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.523 19:58:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:50.091 19:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:50.091 19:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.351 19:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:50.351 19:58:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:50.612 [2024-07-24 19:58:42.033293] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:50.612 BaseBdev1 00:22:50.612 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:50.612 19:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:50.612 19:58:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:50.612 19:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:22:50.612 19:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:50.612 19:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:50.612 19:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:50.873 19:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:51.132 [ 00:22:51.132 { 00:22:51.132 "name": "BaseBdev1", 00:22:51.132 "aliases": [ 00:22:51.132 "4ee4ab7e-a5bd-40ab-b0aa-225800743867" 00:22:51.132 ], 00:22:51.132 "product_name": "Malloc disk", 00:22:51.132 "block_size": 512, 00:22:51.132 "num_blocks": 65536, 00:22:51.132 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:22:51.132 "assigned_rate_limits": { 00:22:51.132 "rw_ios_per_sec": 0, 00:22:51.132 "rw_mbytes_per_sec": 0, 00:22:51.132 "r_mbytes_per_sec": 0, 00:22:51.132 "w_mbytes_per_sec": 0 00:22:51.132 }, 00:22:51.132 "claimed": true, 00:22:51.132 "claim_type": "exclusive_write", 00:22:51.132 "zoned": false, 00:22:51.132 "supported_io_types": { 00:22:51.132 "read": true, 00:22:51.132 "write": true, 00:22:51.132 "unmap": true, 00:22:51.132 "flush": true, 00:22:51.132 "reset": true, 00:22:51.132 "nvme_admin": false, 00:22:51.132 "nvme_io": false, 00:22:51.132 "nvme_io_md": false, 00:22:51.132 "write_zeroes": true, 00:22:51.132 "zcopy": true, 00:22:51.132 "get_zone_info": false, 00:22:51.132 "zone_management": false, 00:22:51.132 "zone_append": false, 00:22:51.132 "compare": false, 00:22:51.132 
"compare_and_write": false, 00:22:51.132 "abort": true, 00:22:51.132 "seek_hole": false, 00:22:51.132 "seek_data": false, 00:22:51.132 "copy": true, 00:22:51.132 "nvme_iov_md": false 00:22:51.132 }, 00:22:51.132 "memory_domains": [ 00:22:51.132 { 00:22:51.132 "dma_device_id": "system", 00:22:51.132 "dma_device_type": 1 00:22:51.132 }, 00:22:51.132 { 00:22:51.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.132 "dma_device_type": 2 00:22:51.132 } 00:22:51.132 ], 00:22:51.132 "driver_specific": {} 00:22:51.132 } 00:22:51.132 ] 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.132 19:58:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:51.700 19:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:51.700 "name": "Existed_Raid", 00:22:51.700 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:22:51.700 "strip_size_kb": 0, 00:22:51.700 "state": "configuring", 00:22:51.700 "raid_level": "raid1", 00:22:51.700 "superblock": true, 00:22:51.700 "num_base_bdevs": 4, 00:22:51.700 "num_base_bdevs_discovered": 3, 00:22:51.700 "num_base_bdevs_operational": 4, 00:22:51.700 "base_bdevs_list": [ 00:22:51.700 { 00:22:51.700 "name": "BaseBdev1", 00:22:51.700 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:22:51.700 "is_configured": true, 00:22:51.700 "data_offset": 2048, 00:22:51.700 "data_size": 63488 00:22:51.700 }, 00:22:51.700 { 00:22:51.700 "name": null, 00:22:51.700 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:51.700 "is_configured": false, 00:22:51.700 "data_offset": 2048, 00:22:51.700 "data_size": 63488 00:22:51.700 }, 00:22:51.700 { 00:22:51.700 "name": "BaseBdev3", 00:22:51.700 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:51.700 "is_configured": true, 00:22:51.700 "data_offset": 2048, 00:22:51.700 "data_size": 63488 00:22:51.700 }, 00:22:51.700 { 00:22:51.700 "name": "BaseBdev4", 00:22:51.700 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:51.700 "is_configured": true, 00:22:51.700 "data_offset": 2048, 00:22:51.700 "data_size": 63488 00:22:51.700 } 00:22:51.700 ] 00:22:51.700 }' 00:22:51.700 19:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:51.700 19:58:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:52.735 19:58:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:52.735 19:58:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.735 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:52.735 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:53.013 [2024-07-24 19:58:44.351483] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:53.013 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:53.272 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.272 "name": "Existed_Raid", 00:22:53.272 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:22:53.272 "strip_size_kb": 0, 00:22:53.272 "state": "configuring", 00:22:53.272 "raid_level": "raid1", 00:22:53.272 "superblock": true, 00:22:53.272 "num_base_bdevs": 4, 00:22:53.272 "num_base_bdevs_discovered": 2, 00:22:53.272 "num_base_bdevs_operational": 4, 00:22:53.272 "base_bdevs_list": [ 00:22:53.272 { 00:22:53.272 "name": "BaseBdev1", 00:22:53.272 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:22:53.272 "is_configured": true, 00:22:53.272 "data_offset": 2048, 00:22:53.272 "data_size": 63488 00:22:53.272 }, 00:22:53.272 { 00:22:53.272 "name": null, 00:22:53.272 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:53.272 "is_configured": false, 00:22:53.272 "data_offset": 2048, 00:22:53.272 "data_size": 63488 00:22:53.272 }, 00:22:53.272 { 00:22:53.272 "name": null, 00:22:53.272 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:53.272 "is_configured": false, 00:22:53.272 "data_offset": 2048, 00:22:53.272 "data_size": 63488 00:22:53.272 }, 00:22:53.272 { 00:22:53.272 "name": "BaseBdev4", 00:22:53.272 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:53.272 "is_configured": true, 00:22:53.272 "data_offset": 2048, 00:22:53.272 "data_size": 63488 00:22:53.272 } 00:22:53.272 ] 00:22:53.272 }' 00:22:53.272 19:58:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.272 19:58:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:54.210 19:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:54.210 19:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:54.210 19:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:54.210 19:58:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:54.470 [2024-07-24 19:58:45.991815] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:54.470 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:54.729 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.729 "name": "Existed_Raid", 00:22:54.729 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:22:54.729 "strip_size_kb": 0, 00:22:54.729 "state": "configuring", 00:22:54.729 "raid_level": "raid1", 00:22:54.729 "superblock": true, 00:22:54.729 "num_base_bdevs": 4, 00:22:54.729 "num_base_bdevs_discovered": 3, 00:22:54.729 "num_base_bdevs_operational": 4, 00:22:54.729 "base_bdevs_list": [ 00:22:54.729 { 00:22:54.729 "name": "BaseBdev1", 00:22:54.729 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:22:54.729 "is_configured": true, 00:22:54.729 "data_offset": 2048, 00:22:54.729 "data_size": 63488 00:22:54.729 }, 00:22:54.729 { 00:22:54.729 "name": null, 00:22:54.729 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:54.729 "is_configured": false, 00:22:54.729 "data_offset": 2048, 00:22:54.729 "data_size": 63488 00:22:54.729 }, 00:22:54.729 { 00:22:54.729 "name": "BaseBdev3", 00:22:54.729 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:54.729 "is_configured": true, 00:22:54.729 "data_offset": 2048, 00:22:54.729 "data_size": 63488 00:22:54.729 }, 00:22:54.729 { 00:22:54.729 "name": "BaseBdev4", 00:22:54.729 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:54.729 "is_configured": true, 00:22:54.729 "data_offset": 2048, 00:22:54.729 "data_size": 63488 00:22:54.729 } 00:22:54.729 ] 00:22:54.729 }' 00:22:54.729 19:58:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.729 19:58:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:55.667 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:55.667 19:58:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.926 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:55.926 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:55.926 [2024-07-24 19:58:47.499838] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:22:56.186 19:58:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:56.754 19:58:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.754 "name": "Existed_Raid", 00:22:56.754 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:22:56.754 "strip_size_kb": 0, 00:22:56.754 "state": "configuring", 00:22:56.754 "raid_level": "raid1", 00:22:56.754 "superblock": true, 00:22:56.754 "num_base_bdevs": 4, 00:22:56.754 "num_base_bdevs_discovered": 2, 00:22:56.754 "num_base_bdevs_operational": 4, 00:22:56.754 "base_bdevs_list": [ 00:22:56.754 { 00:22:56.754 "name": null, 00:22:56.754 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:22:56.754 "is_configured": false, 00:22:56.754 "data_offset": 2048, 00:22:56.754 "data_size": 63488 00:22:56.754 }, 00:22:56.754 { 00:22:56.754 "name": null, 00:22:56.754 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:56.754 "is_configured": false, 00:22:56.755 "data_offset": 2048, 00:22:56.755 "data_size": 63488 00:22:56.755 }, 00:22:56.755 { 00:22:56.755 "name": "BaseBdev3", 00:22:56.755 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:56.755 "is_configured": true, 00:22:56.755 "data_offset": 2048, 00:22:56.755 "data_size": 63488 00:22:56.755 }, 00:22:56.755 { 00:22:56.755 "name": "BaseBdev4", 00:22:56.755 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:56.755 "is_configured": true, 00:22:56.755 "data_offset": 2048, 00:22:56.755 "data_size": 63488 00:22:56.755 } 00:22:56.755 ] 00:22:56.755 }' 00:22:56.755 19:58:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.755 19:58:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:57.322 19:58:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.322 
19:58:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:57.322 19:58:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:57.322 19:58:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:57.891 [2024-07-24 19:58:49.381272] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:57.891 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:57.891 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:57.891 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:57.891 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:57.891 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:57.891 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:57.892 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:57.892 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:57.892 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:57.892 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:57.892 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.892 19:58:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:58.461 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.461 "name": "Existed_Raid", 00:22:58.461 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:22:58.461 "strip_size_kb": 0, 00:22:58.461 "state": "configuring", 00:22:58.461 "raid_level": "raid1", 00:22:58.461 "superblock": true, 00:22:58.461 "num_base_bdevs": 4, 00:22:58.461 "num_base_bdevs_discovered": 3, 00:22:58.461 "num_base_bdevs_operational": 4, 00:22:58.461 "base_bdevs_list": [ 00:22:58.461 { 00:22:58.461 "name": null, 00:22:58.461 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:22:58.461 "is_configured": false, 00:22:58.461 "data_offset": 2048, 00:22:58.461 "data_size": 63488 00:22:58.461 }, 00:22:58.461 { 00:22:58.461 "name": "BaseBdev2", 00:22:58.461 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:22:58.461 "is_configured": true, 00:22:58.461 "data_offset": 2048, 00:22:58.461 "data_size": 63488 00:22:58.461 }, 00:22:58.461 { 00:22:58.461 "name": "BaseBdev3", 00:22:58.461 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:22:58.461 "is_configured": true, 00:22:58.461 "data_offset": 2048, 00:22:58.461 "data_size": 63488 00:22:58.461 }, 00:22:58.461 { 00:22:58.461 "name": "BaseBdev4", 00:22:58.461 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:22:58.461 "is_configured": true, 00:22:58.461 "data_offset": 2048, 00:22:58.461 "data_size": 63488 00:22:58.461 } 00:22:58.461 ] 00:22:58.461 }' 00:22:58.461 19:58:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.461 19:58:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:59.029 19:58:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.029 19:58:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:59.288 19:58:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:59.288 19:58:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.288 19:58:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:59.856 19:58:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4ee4ab7e-a5bd-40ab-b0aa-225800743867 00:23:00.424 [2024-07-24 19:58:51.788253] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:00.424 [2024-07-24 19:58:51.788443] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c18b0 00:23:00.424 [2024-07-24 19:58:51.788457] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:00.424 [2024-07-24 19:58:51.788643] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16be170 00:23:00.424 [2024-07-24 19:58:51.788766] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c18b0 00:23:00.424 [2024-07-24 19:58:51.788776] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16c18b0 00:23:00.424 [2024-07-24 19:58:51.788868] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:00.424 NewBaseBdev 00:23:00.424 19:58:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:00.424 19:58:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:23:00.424 19:58:51 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:00.424 19:58:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:23:00.424 19:58:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:00.424 19:58:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:00.424 19:58:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:00.724 19:58:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:23:00.984 [ 00:23:00.984 { 00:23:00.984 "name": "NewBaseBdev", 00:23:00.984 "aliases": [ 00:23:00.984 "4ee4ab7e-a5bd-40ab-b0aa-225800743867" 00:23:00.984 ], 00:23:00.984 "product_name": "Malloc disk", 00:23:00.984 "block_size": 512, 00:23:00.984 "num_blocks": 65536, 00:23:00.984 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:23:00.984 "assigned_rate_limits": { 00:23:00.984 "rw_ios_per_sec": 0, 00:23:00.984 "rw_mbytes_per_sec": 0, 00:23:00.984 "r_mbytes_per_sec": 0, 00:23:00.984 "w_mbytes_per_sec": 0 00:23:00.984 }, 00:23:00.984 "claimed": true, 00:23:00.984 "claim_type": "exclusive_write", 00:23:00.984 "zoned": false, 00:23:00.984 "supported_io_types": { 00:23:00.984 "read": true, 00:23:00.984 "write": true, 00:23:00.984 "unmap": true, 00:23:00.984 "flush": true, 00:23:00.984 "reset": true, 00:23:00.984 "nvme_admin": false, 00:23:00.984 "nvme_io": false, 00:23:00.984 "nvme_io_md": false, 00:23:00.984 "write_zeroes": true, 00:23:00.984 "zcopy": true, 00:23:00.984 "get_zone_info": false, 00:23:00.984 "zone_management": false, 00:23:00.984 "zone_append": false, 00:23:00.984 "compare": false, 00:23:00.984 
"compare_and_write": false, 00:23:00.984 "abort": true, 00:23:00.984 "seek_hole": false, 00:23:00.984 "seek_data": false, 00:23:00.984 "copy": true, 00:23:00.984 "nvme_iov_md": false 00:23:00.984 }, 00:23:00.984 "memory_domains": [ 00:23:00.984 { 00:23:00.984 "dma_device_id": "system", 00:23:00.984 "dma_device_type": 1 00:23:00.984 }, 00:23:00.984 { 00:23:00.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.984 "dma_device_type": 2 00:23:00.984 } 00:23:00.984 ], 00:23:00.984 "driver_specific": {} 00:23:00.984 } 00:23:00.984 ] 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:00.984 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:01.552 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.552 "name": "Existed_Raid", 00:23:01.552 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:23:01.552 "strip_size_kb": 0, 00:23:01.552 "state": "online", 00:23:01.552 "raid_level": "raid1", 00:23:01.552 "superblock": true, 00:23:01.552 "num_base_bdevs": 4, 00:23:01.552 "num_base_bdevs_discovered": 4, 00:23:01.552 "num_base_bdevs_operational": 4, 00:23:01.552 "base_bdevs_list": [ 00:23:01.552 { 00:23:01.552 "name": "NewBaseBdev", 00:23:01.552 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:23:01.552 "is_configured": true, 00:23:01.552 "data_offset": 2048, 00:23:01.552 "data_size": 63488 00:23:01.552 }, 00:23:01.552 { 00:23:01.552 "name": "BaseBdev2", 00:23:01.552 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:23:01.552 "is_configured": true, 00:23:01.552 "data_offset": 2048, 00:23:01.552 "data_size": 63488 00:23:01.552 }, 00:23:01.552 { 00:23:01.552 "name": "BaseBdev3", 00:23:01.552 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:23:01.552 "is_configured": true, 00:23:01.552 "data_offset": 2048, 00:23:01.552 "data_size": 63488 00:23:01.552 }, 00:23:01.552 { 00:23:01.552 "name": "BaseBdev4", 00:23:01.552 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:23:01.552 "is_configured": true, 00:23:01.552 "data_offset": 2048, 00:23:01.552 "data_size": 63488 00:23:01.552 } 00:23:01.552 ] 00:23:01.552 }' 00:23:01.552 19:58:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.552 19:58:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:02.119 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:23:02.119 19:58:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:02.119 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:02.119 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:02.119 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:02.119 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:02.119 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:02.119 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:02.379 [2024-07-24 19:58:53.721698] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:02.379 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:02.379 "name": "Existed_Raid", 00:23:02.379 "aliases": [ 00:23:02.379 "395512f7-7045-4b7f-bc54-3db130f35e3b" 00:23:02.379 ], 00:23:02.379 "product_name": "Raid Volume", 00:23:02.379 "block_size": 512, 00:23:02.379 "num_blocks": 63488, 00:23:02.379 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:23:02.379 "assigned_rate_limits": { 00:23:02.379 "rw_ios_per_sec": 0, 00:23:02.379 "rw_mbytes_per_sec": 0, 00:23:02.379 "r_mbytes_per_sec": 0, 00:23:02.379 "w_mbytes_per_sec": 0 00:23:02.379 }, 00:23:02.379 "claimed": false, 00:23:02.379 "zoned": false, 00:23:02.379 "supported_io_types": { 00:23:02.379 "read": true, 00:23:02.379 "write": true, 00:23:02.379 "unmap": false, 00:23:02.379 "flush": false, 00:23:02.379 "reset": true, 00:23:02.379 "nvme_admin": false, 00:23:02.379 "nvme_io": false, 00:23:02.379 "nvme_io_md": false, 00:23:02.379 "write_zeroes": true, 00:23:02.379 "zcopy": false, 00:23:02.379 
"get_zone_info": false, 00:23:02.379 "zone_management": false, 00:23:02.379 "zone_append": false, 00:23:02.379 "compare": false, 00:23:02.379 "compare_and_write": false, 00:23:02.379 "abort": false, 00:23:02.379 "seek_hole": false, 00:23:02.379 "seek_data": false, 00:23:02.379 "copy": false, 00:23:02.379 "nvme_iov_md": false 00:23:02.379 }, 00:23:02.379 "memory_domains": [ 00:23:02.379 { 00:23:02.379 "dma_device_id": "system", 00:23:02.379 "dma_device_type": 1 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.379 "dma_device_type": 2 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "dma_device_id": "system", 00:23:02.379 "dma_device_type": 1 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.379 "dma_device_type": 2 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "dma_device_id": "system", 00:23:02.379 "dma_device_type": 1 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.379 "dma_device_type": 2 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "dma_device_id": "system", 00:23:02.379 "dma_device_type": 1 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.379 "dma_device_type": 2 00:23:02.379 } 00:23:02.379 ], 00:23:02.379 "driver_specific": { 00:23:02.379 "raid": { 00:23:02.379 "uuid": "395512f7-7045-4b7f-bc54-3db130f35e3b", 00:23:02.379 "strip_size_kb": 0, 00:23:02.379 "state": "online", 00:23:02.379 "raid_level": "raid1", 00:23:02.379 "superblock": true, 00:23:02.379 "num_base_bdevs": 4, 00:23:02.379 "num_base_bdevs_discovered": 4, 00:23:02.379 "num_base_bdevs_operational": 4, 00:23:02.379 "base_bdevs_list": [ 00:23:02.379 { 00:23:02.379 "name": "NewBaseBdev", 00:23:02.379 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:23:02.379 "is_configured": true, 00:23:02.379 "data_offset": 2048, 00:23:02.379 "data_size": 63488 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "name": "BaseBdev2", 00:23:02.379 
"uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:23:02.379 "is_configured": true, 00:23:02.379 "data_offset": 2048, 00:23:02.379 "data_size": 63488 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "name": "BaseBdev3", 00:23:02.379 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:23:02.379 "is_configured": true, 00:23:02.379 "data_offset": 2048, 00:23:02.379 "data_size": 63488 00:23:02.379 }, 00:23:02.379 { 00:23:02.379 "name": "BaseBdev4", 00:23:02.379 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:23:02.379 "is_configured": true, 00:23:02.379 "data_offset": 2048, 00:23:02.379 "data_size": 63488 00:23:02.379 } 00:23:02.379 ] 00:23:02.379 } 00:23:02.379 } 00:23:02.379 }' 00:23:02.379 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:02.379 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:23:02.379 BaseBdev2 00:23:02.379 BaseBdev3 00:23:02.379 BaseBdev4' 00:23:02.379 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:02.380 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:23:02.380 19:58:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:02.947 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:02.947 "name": "NewBaseBdev", 00:23:02.947 "aliases": [ 00:23:02.947 "4ee4ab7e-a5bd-40ab-b0aa-225800743867" 00:23:02.947 ], 00:23:02.947 "product_name": "Malloc disk", 00:23:02.947 "block_size": 512, 00:23:02.947 "num_blocks": 65536, 00:23:02.947 "uuid": "4ee4ab7e-a5bd-40ab-b0aa-225800743867", 00:23:02.947 "assigned_rate_limits": { 00:23:02.947 "rw_ios_per_sec": 0, 00:23:02.947 "rw_mbytes_per_sec": 0, 
00:23:02.947 "r_mbytes_per_sec": 0, 00:23:02.947 "w_mbytes_per_sec": 0 00:23:02.947 }, 00:23:02.947 "claimed": true, 00:23:02.947 "claim_type": "exclusive_write", 00:23:02.947 "zoned": false, 00:23:02.947 "supported_io_types": { 00:23:02.947 "read": true, 00:23:02.947 "write": true, 00:23:02.947 "unmap": true, 00:23:02.947 "flush": true, 00:23:02.947 "reset": true, 00:23:02.947 "nvme_admin": false, 00:23:02.947 "nvme_io": false, 00:23:02.947 "nvme_io_md": false, 00:23:02.947 "write_zeroes": true, 00:23:02.947 "zcopy": true, 00:23:02.947 "get_zone_info": false, 00:23:02.947 "zone_management": false, 00:23:02.947 "zone_append": false, 00:23:02.947 "compare": false, 00:23:02.947 "compare_and_write": false, 00:23:02.947 "abort": true, 00:23:02.947 "seek_hole": false, 00:23:02.947 "seek_data": false, 00:23:02.947 "copy": true, 00:23:02.947 "nvme_iov_md": false 00:23:02.947 }, 00:23:02.947 "memory_domains": [ 00:23:02.947 { 00:23:02.947 "dma_device_id": "system", 00:23:02.947 "dma_device_type": 1 00:23:02.947 }, 00:23:02.947 { 00:23:02.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.947 "dma_device_type": 2 00:23:02.947 } 00:23:02.947 ], 00:23:02.947 "driver_specific": {} 00:23:02.947 }' 00:23:02.947 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:02.947 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:02.947 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:02.947 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:02.947 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:03.206 19:58:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:03.206 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:03.207 19:58:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:03.466 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:03.466 "name": "BaseBdev2", 00:23:03.466 "aliases": [ 00:23:03.466 "aa3a78f6-1032-448c-9198-b23cec18b3f3" 00:23:03.466 ], 00:23:03.466 "product_name": "Malloc disk", 00:23:03.466 "block_size": 512, 00:23:03.466 "num_blocks": 65536, 00:23:03.466 "uuid": "aa3a78f6-1032-448c-9198-b23cec18b3f3", 00:23:03.466 "assigned_rate_limits": { 00:23:03.466 "rw_ios_per_sec": 0, 00:23:03.466 "rw_mbytes_per_sec": 0, 00:23:03.466 "r_mbytes_per_sec": 0, 00:23:03.466 "w_mbytes_per_sec": 0 00:23:03.466 }, 00:23:03.466 "claimed": true, 00:23:03.466 "claim_type": "exclusive_write", 00:23:03.466 "zoned": false, 00:23:03.466 "supported_io_types": { 00:23:03.466 "read": true, 00:23:03.466 "write": true, 00:23:03.466 "unmap": true, 00:23:03.466 "flush": true, 00:23:03.466 "reset": true, 00:23:03.466 "nvme_admin": false, 00:23:03.466 "nvme_io": false, 00:23:03.466 "nvme_io_md": false, 00:23:03.466 "write_zeroes": true, 00:23:03.466 "zcopy": true, 00:23:03.466 
"get_zone_info": false, 00:23:03.466 "zone_management": false, 00:23:03.466 "zone_append": false, 00:23:03.466 "compare": false, 00:23:03.466 "compare_and_write": false, 00:23:03.466 "abort": true, 00:23:03.466 "seek_hole": false, 00:23:03.466 "seek_data": false, 00:23:03.466 "copy": true, 00:23:03.466 "nvme_iov_md": false 00:23:03.466 }, 00:23:03.466 "memory_domains": [ 00:23:03.466 { 00:23:03.466 "dma_device_id": "system", 00:23:03.466 "dma_device_type": 1 00:23:03.466 }, 00:23:03.466 { 00:23:03.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.466 "dma_device_type": 2 00:23:03.466 } 00:23:03.466 ], 00:23:03.466 "driver_specific": {} 00:23:03.466 }' 00:23:03.466 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:03.725 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:03.725 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:03.725 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:03.725 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:03.725 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:03.725 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:03.983 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:03.983 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:03.983 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:03.983 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:04.243 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:04.243 19:58:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:04.243 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:04.243 19:58:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:04.810 "name": "BaseBdev3", 00:23:04.810 "aliases": [ 00:23:04.810 "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8" 00:23:04.810 ], 00:23:04.810 "product_name": "Malloc disk", 00:23:04.810 "block_size": 512, 00:23:04.810 "num_blocks": 65536, 00:23:04.810 "uuid": "e21fe334-8b39-444b-aa10-d5c8b4c9a4b8", 00:23:04.810 "assigned_rate_limits": { 00:23:04.810 "rw_ios_per_sec": 0, 00:23:04.810 "rw_mbytes_per_sec": 0, 00:23:04.810 "r_mbytes_per_sec": 0, 00:23:04.810 "w_mbytes_per_sec": 0 00:23:04.810 }, 00:23:04.810 "claimed": true, 00:23:04.810 "claim_type": "exclusive_write", 00:23:04.810 "zoned": false, 00:23:04.810 "supported_io_types": { 00:23:04.810 "read": true, 00:23:04.810 "write": true, 00:23:04.810 "unmap": true, 00:23:04.810 "flush": true, 00:23:04.810 "reset": true, 00:23:04.810 "nvme_admin": false, 00:23:04.810 "nvme_io": false, 00:23:04.810 "nvme_io_md": false, 00:23:04.810 "write_zeroes": true, 00:23:04.810 "zcopy": true, 00:23:04.810 "get_zone_info": false, 00:23:04.810 "zone_management": false, 00:23:04.810 "zone_append": false, 00:23:04.810 "compare": false, 00:23:04.810 "compare_and_write": false, 00:23:04.810 "abort": true, 00:23:04.810 "seek_hole": false, 00:23:04.810 "seek_data": false, 00:23:04.810 "copy": true, 00:23:04.810 "nvme_iov_md": false 00:23:04.810 }, 00:23:04.810 "memory_domains": [ 00:23:04.810 { 00:23:04.810 "dma_device_id": "system", 00:23:04.810 "dma_device_type": 1 00:23:04.810 }, 00:23:04.810 { 00:23:04.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:04.810 
"dma_device_type": 2 00:23:04.810 } 00:23:04.810 ], 00:23:04.810 "driver_specific": {} 00:23:04.810 }' 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:04.810 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.069 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.069 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:05.069 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.069 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.069 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:05.069 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:05.328 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:05.328 19:58:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:05.587 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:05.587 "name": "BaseBdev4", 00:23:05.587 "aliases": [ 00:23:05.587 
"fc54eec3-0e01-4dc7-a7fd-7c073748fdee" 00:23:05.587 ], 00:23:05.587 "product_name": "Malloc disk", 00:23:05.587 "block_size": 512, 00:23:05.587 "num_blocks": 65536, 00:23:05.587 "uuid": "fc54eec3-0e01-4dc7-a7fd-7c073748fdee", 00:23:05.587 "assigned_rate_limits": { 00:23:05.587 "rw_ios_per_sec": 0, 00:23:05.587 "rw_mbytes_per_sec": 0, 00:23:05.587 "r_mbytes_per_sec": 0, 00:23:05.587 "w_mbytes_per_sec": 0 00:23:05.587 }, 00:23:05.587 "claimed": true, 00:23:05.587 "claim_type": "exclusive_write", 00:23:05.587 "zoned": false, 00:23:05.587 "supported_io_types": { 00:23:05.587 "read": true, 00:23:05.587 "write": true, 00:23:05.587 "unmap": true, 00:23:05.587 "flush": true, 00:23:05.587 "reset": true, 00:23:05.587 "nvme_admin": false, 00:23:05.587 "nvme_io": false, 00:23:05.587 "nvme_io_md": false, 00:23:05.587 "write_zeroes": true, 00:23:05.587 "zcopy": true, 00:23:05.587 "get_zone_info": false, 00:23:05.587 "zone_management": false, 00:23:05.587 "zone_append": false, 00:23:05.587 "compare": false, 00:23:05.587 "compare_and_write": false, 00:23:05.587 "abort": true, 00:23:05.587 "seek_hole": false, 00:23:05.587 "seek_data": false, 00:23:05.587 "copy": true, 00:23:05.587 "nvme_iov_md": false 00:23:05.587 }, 00:23:05.587 "memory_domains": [ 00:23:05.587 { 00:23:05.587 "dma_device_id": "system", 00:23:05.587 "dma_device_type": 1 00:23:05.587 }, 00:23:05.587 { 00:23:05.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.587 "dma_device_type": 2 00:23:05.587 } 00:23:05.587 ], 00:23:05.587 "driver_specific": {} 00:23:05.587 }' 00:23:05.587 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.846 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.846 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:05.846 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.846 19:58:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.846 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:05.846 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.846 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.104 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:06.104 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.104 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.104 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:06.104 19:58:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:06.672 [2024-07-24 19:58:58.068917] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:06.672 [2024-07-24 19:58:58.068954] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:06.672 [2024-07-24 19:58:58.069008] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:06.672 [2024-07-24 19:58:58.069278] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:06.672 [2024-07-24 19:58:58.069290] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c18b0 name Existed_Raid, state offline 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1472949 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1472949 ']' 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@954 -- # kill -0 1472949 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1472949 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1472949' 00:23:06.672 killing process with pid 1472949 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 1472949 00:23:06.672 [2024-07-24 19:58:58.150951] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:06.672 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 1472949 00:23:06.672 [2024-07-24 19:58:58.192706] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:06.932 19:58:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:23:06.932 00:23:06.932 real 0m37.548s 00:23:06.932 user 1m9.221s 00:23:06.932 sys 0m6.424s 00:23:06.932 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:06.932 19:58:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:06.932 ************************************ 00:23:06.932 END TEST raid_state_function_test_sb 00:23:06.932 ************************************ 00:23:06.932 19:58:58 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:23:06.932 19:58:58 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:23:06.932 19:58:58 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:06.932 19:58:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:06.932 ************************************ 00:23:06.932 START TEST raid_superblock_test 00:23:06.932 ************************************ 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 4 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:23:06.932 19:58:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=1478375 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 1478375 /var/tmp/spdk-raid.sock 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 1478375 ']' 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:06.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:06.932 19:58:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:07.192 [2024-07-24 19:58:58.570412] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:23:07.192 [2024-07-24 19:58:58.570484] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1478375 ] 00:23:07.192 [2024-07-24 19:58:58.701583] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:07.450 [2024-07-24 19:58:58.813765] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:07.450 [2024-07-24 19:58:58.879350] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.450 [2024-07-24 19:58:58.879378] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:08.019 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:23:08.278 malloc1 00:23:08.278 19:58:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:08.846 [2024-07-24 19:59:00.205131] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:08.846 [2024-07-24 19:59:00.205184] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.846 [2024-07-24 19:59:00.205210] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c58590 00:23:08.846 [2024-07-24 19:59:00.205223] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.846 [2024-07-24 19:59:00.207035] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.846 [2024-07-24 19:59:00.207066] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:08.846 pt1 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:23:08.846 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:08.846 19:59:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:23:09.105 malloc2 00:23:09.105 19:59:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:09.673 [2024-07-24 19:59:00.977100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:09.673 [2024-07-24 19:59:00.977146] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.673 [2024-07-24 19:59:00.977164] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dfe690 00:23:09.673 [2024-07-24 19:59:00.977176] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.673 [2024-07-24 19:59:00.978744] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.673 [2024-07-24 19:59:00.978771] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:09.673 pt2 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:23:09.673 19:59:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:09.673 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:23:09.932 malloc3 00:23:10.191 19:59:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:10.449 [2024-07-24 19:59:02.012387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:10.449 [2024-07-24 19:59:02.012440] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:10.449 [2024-07-24 19:59:02.012459] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dfffc0 00:23:10.449 [2024-07-24 19:59:02.012472] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:10.449 [2024-07-24 19:59:02.014085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:10.449 [2024-07-24 19:59:02.014114] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:10.449 pt3 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:10.708 
19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:10.708 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:23:10.967 malloc4 00:23:11.226 19:59:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:11.485 [2024-07-24 19:59:03.044885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:11.485 [2024-07-24 19:59:03.044937] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:11.485 [2024-07-24 19:59:03.044957] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e011c0 00:23:11.485 [2024-07-24 19:59:03.044970] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:11.485 [2024-07-24 19:59:03.046557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:11.485 [2024-07-24 19:59:03.046585] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:11.485 pt4 00:23:11.485 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:23:11.485 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:23:11.485 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:23:12.052 [2024-07-24 19:59:03.558250] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:23:12.052 [2024-07-24 19:59:03.559572] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:12.052 [2024-07-24 19:59:03.559627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:12.052 [2024-07-24 19:59:03.559673] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:12.052 [2024-07-24 19:59:03.559852] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e09e80 00:23:12.052 [2024-07-24 19:59:03.559863] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:12.052 [2024-07-24 19:59:03.560059] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c6f480 00:23:12.053 [2024-07-24 19:59:03.560215] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e09e80 00:23:12.053 [2024-07-24 19:59:03.560225] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e09e80 00:23:12.053 [2024-07-24 19:59:03.560324] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.053 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.353 19:59:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.353 "name": "raid_bdev1", 00:23:12.353 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:12.353 "strip_size_kb": 0, 00:23:12.353 "state": "online", 00:23:12.353 "raid_level": "raid1", 00:23:12.353 "superblock": true, 00:23:12.353 "num_base_bdevs": 4, 00:23:12.353 "num_base_bdevs_discovered": 4, 00:23:12.353 "num_base_bdevs_operational": 4, 00:23:12.353 "base_bdevs_list": [ 00:23:12.353 { 00:23:12.353 "name": "pt1", 00:23:12.353 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:12.353 "is_configured": true, 00:23:12.353 "data_offset": 2048, 00:23:12.353 "data_size": 63488 00:23:12.353 }, 00:23:12.353 { 00:23:12.353 "name": "pt2", 00:23:12.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:12.353 "is_configured": true, 00:23:12.353 "data_offset": 2048, 00:23:12.353 "data_size": 63488 00:23:12.353 }, 00:23:12.353 { 00:23:12.353 "name": "pt3", 00:23:12.353 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:12.353 "is_configured": true, 00:23:12.353 "data_offset": 2048, 00:23:12.353 "data_size": 63488 00:23:12.353 }, 00:23:12.353 { 00:23:12.353 "name": "pt4", 00:23:12.353 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:12.353 "is_configured": true, 00:23:12.353 "data_offset": 2048, 00:23:12.353 "data_size": 63488 00:23:12.353 } 00:23:12.353 ] 00:23:12.353 }' 00:23:12.353 19:59:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.353 19:59:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:13.331 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:23:13.331 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:13.331 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:13.331 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:13.331 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:13.332 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:13.332 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:13.332 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:13.332 [2024-07-24 19:59:04.853946] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:13.332 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:13.332 "name": "raid_bdev1", 00:23:13.332 "aliases": [ 00:23:13.332 "cf506415-f26f-49b3-9317-04c77703b1be" 00:23:13.332 ], 00:23:13.332 "product_name": "Raid Volume", 00:23:13.332 "block_size": 512, 00:23:13.332 "num_blocks": 63488, 00:23:13.332 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:13.332 "assigned_rate_limits": { 00:23:13.332 "rw_ios_per_sec": 0, 00:23:13.332 "rw_mbytes_per_sec": 0, 00:23:13.332 "r_mbytes_per_sec": 0, 00:23:13.332 "w_mbytes_per_sec": 0 00:23:13.332 }, 00:23:13.332 "claimed": false, 00:23:13.332 "zoned": false, 00:23:13.332 "supported_io_types": { 00:23:13.332 "read": true, 00:23:13.332 "write": true, 00:23:13.332 
"unmap": false, 00:23:13.332 "flush": false, 00:23:13.332 "reset": true, 00:23:13.332 "nvme_admin": false, 00:23:13.332 "nvme_io": false, 00:23:13.332 "nvme_io_md": false, 00:23:13.332 "write_zeroes": true, 00:23:13.332 "zcopy": false, 00:23:13.332 "get_zone_info": false, 00:23:13.332 "zone_management": false, 00:23:13.332 "zone_append": false, 00:23:13.332 "compare": false, 00:23:13.332 "compare_and_write": false, 00:23:13.332 "abort": false, 00:23:13.332 "seek_hole": false, 00:23:13.332 "seek_data": false, 00:23:13.332 "copy": false, 00:23:13.332 "nvme_iov_md": false 00:23:13.332 }, 00:23:13.332 "memory_domains": [ 00:23:13.332 { 00:23:13.332 "dma_device_id": "system", 00:23:13.332 "dma_device_type": 1 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:13.332 "dma_device_type": 2 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "dma_device_id": "system", 00:23:13.332 "dma_device_type": 1 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:13.332 "dma_device_type": 2 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "dma_device_id": "system", 00:23:13.332 "dma_device_type": 1 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:13.332 "dma_device_type": 2 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "dma_device_id": "system", 00:23:13.332 "dma_device_type": 1 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:13.332 "dma_device_type": 2 00:23:13.332 } 00:23:13.332 ], 00:23:13.332 "driver_specific": { 00:23:13.332 "raid": { 00:23:13.332 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:13.332 "strip_size_kb": 0, 00:23:13.332 "state": "online", 00:23:13.332 "raid_level": "raid1", 00:23:13.332 "superblock": true, 00:23:13.332 "num_base_bdevs": 4, 00:23:13.332 "num_base_bdevs_discovered": 4, 00:23:13.332 "num_base_bdevs_operational": 4, 00:23:13.332 "base_bdevs_list": [ 00:23:13.332 { 00:23:13.332 "name": "pt1", 
00:23:13.332 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:13.332 "is_configured": true, 00:23:13.332 "data_offset": 2048, 00:23:13.332 "data_size": 63488 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "name": "pt2", 00:23:13.332 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:13.332 "is_configured": true, 00:23:13.332 "data_offset": 2048, 00:23:13.332 "data_size": 63488 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "name": "pt3", 00:23:13.332 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:13.332 "is_configured": true, 00:23:13.332 "data_offset": 2048, 00:23:13.332 "data_size": 63488 00:23:13.332 }, 00:23:13.332 { 00:23:13.332 "name": "pt4", 00:23:13.332 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:13.332 "is_configured": true, 00:23:13.332 "data_offset": 2048, 00:23:13.332 "data_size": 63488 00:23:13.332 } 00:23:13.332 ] 00:23:13.332 } 00:23:13.332 } 00:23:13.332 }' 00:23:13.332 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:13.591 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:13.591 pt2 00:23:13.591 pt3 00:23:13.591 pt4' 00:23:13.591 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:13.591 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:13.591 19:59:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:13.591 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:13.591 "name": "pt1", 00:23:13.591 "aliases": [ 00:23:13.591 "00000000-0000-0000-0000-000000000001" 00:23:13.591 ], 00:23:13.591 "product_name": "passthru", 00:23:13.591 "block_size": 512, 00:23:13.591 "num_blocks": 65536, 00:23:13.591 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:23:13.591 "assigned_rate_limits": { 00:23:13.591 "rw_ios_per_sec": 0, 00:23:13.591 "rw_mbytes_per_sec": 0, 00:23:13.591 "r_mbytes_per_sec": 0, 00:23:13.591 "w_mbytes_per_sec": 0 00:23:13.591 }, 00:23:13.591 "claimed": true, 00:23:13.591 "claim_type": "exclusive_write", 00:23:13.591 "zoned": false, 00:23:13.591 "supported_io_types": { 00:23:13.591 "read": true, 00:23:13.591 "write": true, 00:23:13.591 "unmap": true, 00:23:13.591 "flush": true, 00:23:13.591 "reset": true, 00:23:13.591 "nvme_admin": false, 00:23:13.591 "nvme_io": false, 00:23:13.591 "nvme_io_md": false, 00:23:13.591 "write_zeroes": true, 00:23:13.591 "zcopy": true, 00:23:13.591 "get_zone_info": false, 00:23:13.591 "zone_management": false, 00:23:13.591 "zone_append": false, 00:23:13.591 "compare": false, 00:23:13.591 "compare_and_write": false, 00:23:13.591 "abort": true, 00:23:13.591 "seek_hole": false, 00:23:13.591 "seek_data": false, 00:23:13.591 "copy": true, 00:23:13.591 "nvme_iov_md": false 00:23:13.591 }, 00:23:13.591 "memory_domains": [ 00:23:13.591 { 00:23:13.591 "dma_device_id": "system", 00:23:13.591 "dma_device_type": 1 00:23:13.591 }, 00:23:13.591 { 00:23:13.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:13.591 "dma_device_type": 2 00:23:13.591 } 00:23:13.591 ], 00:23:13.591 "driver_specific": { 00:23:13.591 "passthru": { 00:23:13.591 "name": "pt1", 00:23:13.591 "base_bdev_name": "malloc1" 00:23:13.591 } 00:23:13.591 } 00:23:13.591 }' 00:23:13.591 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:13.851 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:13.851 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:13.851 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:13.851 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:13.851 19:59:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:13.851 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:13.851 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:14.109 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:14.109 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:14.109 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:14.109 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:14.109 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:14.109 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:14.109 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:14.368 "name": "pt2", 00:23:14.368 "aliases": [ 00:23:14.368 "00000000-0000-0000-0000-000000000002" 00:23:14.368 ], 00:23:14.368 "product_name": "passthru", 00:23:14.368 "block_size": 512, 00:23:14.368 "num_blocks": 65536, 00:23:14.368 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:14.368 "assigned_rate_limits": { 00:23:14.368 "rw_ios_per_sec": 0, 00:23:14.368 "rw_mbytes_per_sec": 0, 00:23:14.368 "r_mbytes_per_sec": 0, 00:23:14.368 "w_mbytes_per_sec": 0 00:23:14.368 }, 00:23:14.368 "claimed": true, 00:23:14.368 "claim_type": "exclusive_write", 00:23:14.368 "zoned": false, 00:23:14.368 "supported_io_types": { 00:23:14.368 "read": true, 00:23:14.368 "write": true, 00:23:14.368 "unmap": true, 00:23:14.368 "flush": true, 00:23:14.368 "reset": true, 00:23:14.368 "nvme_admin": false, 00:23:14.368 
"nvme_io": false, 00:23:14.368 "nvme_io_md": false, 00:23:14.368 "write_zeroes": true, 00:23:14.368 "zcopy": true, 00:23:14.368 "get_zone_info": false, 00:23:14.368 "zone_management": false, 00:23:14.368 "zone_append": false, 00:23:14.368 "compare": false, 00:23:14.368 "compare_and_write": false, 00:23:14.368 "abort": true, 00:23:14.368 "seek_hole": false, 00:23:14.368 "seek_data": false, 00:23:14.368 "copy": true, 00:23:14.368 "nvme_iov_md": false 00:23:14.368 }, 00:23:14.368 "memory_domains": [ 00:23:14.368 { 00:23:14.368 "dma_device_id": "system", 00:23:14.368 "dma_device_type": 1 00:23:14.368 }, 00:23:14.368 { 00:23:14.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:14.368 "dma_device_type": 2 00:23:14.368 } 00:23:14.368 ], 00:23:14.368 "driver_specific": { 00:23:14.368 "passthru": { 00:23:14.368 "name": "pt2", 00:23:14.368 "base_bdev_name": "malloc2" 00:23:14.368 } 00:23:14.368 } 00:23:14.368 }' 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:14.368 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:14.627 19:59:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:14.627 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:14.627 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:14.627 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:23:14.627 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:14.627 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:14.627 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:14.627 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:14.886 "name": "pt3", 00:23:14.886 "aliases": [ 00:23:14.886 "00000000-0000-0000-0000-000000000003" 00:23:14.886 ], 00:23:14.886 "product_name": "passthru", 00:23:14.886 "block_size": 512, 00:23:14.886 "num_blocks": 65536, 00:23:14.886 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:14.886 "assigned_rate_limits": { 00:23:14.886 "rw_ios_per_sec": 0, 00:23:14.886 "rw_mbytes_per_sec": 0, 00:23:14.886 "r_mbytes_per_sec": 0, 00:23:14.886 "w_mbytes_per_sec": 0 00:23:14.886 }, 00:23:14.886 "claimed": true, 00:23:14.886 "claim_type": "exclusive_write", 00:23:14.886 "zoned": false, 00:23:14.886 "supported_io_types": { 00:23:14.886 "read": true, 00:23:14.886 "write": true, 00:23:14.886 "unmap": true, 00:23:14.886 "flush": true, 00:23:14.886 "reset": true, 00:23:14.886 "nvme_admin": false, 00:23:14.886 "nvme_io": false, 00:23:14.886 "nvme_io_md": false, 00:23:14.886 "write_zeroes": true, 00:23:14.886 "zcopy": true, 00:23:14.886 "get_zone_info": false, 00:23:14.886 "zone_management": false, 00:23:14.886 "zone_append": false, 00:23:14.886 "compare": false, 00:23:14.886 "compare_and_write": false, 00:23:14.886 "abort": true, 00:23:14.886 "seek_hole": false, 00:23:14.886 "seek_data": false, 00:23:14.886 "copy": true, 00:23:14.886 "nvme_iov_md": false 00:23:14.886 }, 00:23:14.886 "memory_domains": [ 00:23:14.886 { 00:23:14.886 "dma_device_id": "system", 00:23:14.886 
"dma_device_type": 1 00:23:14.886 }, 00:23:14.886 { 00:23:14.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:14.886 "dma_device_type": 2 00:23:14.886 } 00:23:14.886 ], 00:23:14.886 "driver_specific": { 00:23:14.886 "passthru": { 00:23:14.886 "name": "pt3", 00:23:14.886 "base_bdev_name": "malloc3" 00:23:14.886 } 00:23:14.886 } 00:23:14.886 }' 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:14.886 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:15.145 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:15.404 19:59:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:15.404 "name": "pt4", 00:23:15.404 "aliases": [ 00:23:15.404 "00000000-0000-0000-0000-000000000004" 00:23:15.404 ], 00:23:15.404 "product_name": "passthru", 00:23:15.404 "block_size": 512, 00:23:15.404 "num_blocks": 65536, 00:23:15.404 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:15.404 "assigned_rate_limits": { 00:23:15.404 "rw_ios_per_sec": 0, 00:23:15.404 "rw_mbytes_per_sec": 0, 00:23:15.404 "r_mbytes_per_sec": 0, 00:23:15.404 "w_mbytes_per_sec": 0 00:23:15.404 }, 00:23:15.404 "claimed": true, 00:23:15.404 "claim_type": "exclusive_write", 00:23:15.404 "zoned": false, 00:23:15.404 "supported_io_types": { 00:23:15.404 "read": true, 00:23:15.404 "write": true, 00:23:15.404 "unmap": true, 00:23:15.404 "flush": true, 00:23:15.404 "reset": true, 00:23:15.404 "nvme_admin": false, 00:23:15.404 "nvme_io": false, 00:23:15.404 "nvme_io_md": false, 00:23:15.404 "write_zeroes": true, 00:23:15.404 "zcopy": true, 00:23:15.404 "get_zone_info": false, 00:23:15.404 "zone_management": false, 00:23:15.404 "zone_append": false, 00:23:15.404 "compare": false, 00:23:15.404 "compare_and_write": false, 00:23:15.404 "abort": true, 00:23:15.404 "seek_hole": false, 00:23:15.404 "seek_data": false, 00:23:15.404 "copy": true, 00:23:15.404 "nvme_iov_md": false 00:23:15.404 }, 00:23:15.404 "memory_domains": [ 00:23:15.404 { 00:23:15.404 "dma_device_id": "system", 00:23:15.404 "dma_device_type": 1 00:23:15.404 }, 00:23:15.404 { 00:23:15.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:15.404 "dma_device_type": 2 00:23:15.404 } 00:23:15.404 ], 00:23:15.404 "driver_specific": { 00:23:15.404 "passthru": { 00:23:15.404 "name": "pt4", 00:23:15.404 "base_bdev_name": "malloc4" 00:23:15.404 } 00:23:15.404 } 00:23:15.404 }' 00:23:15.404 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:15.404 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:15.404 19:59:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:15.404 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:15.404 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:15.404 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:15.404 19:59:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:15.663 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:15.663 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:15.663 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:15.664 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:15.664 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:15.664 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:23:15.664 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:15.922 [2024-07-24 19:59:07.396683] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:15.922 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=cf506415-f26f-49b3-9317-04c77703b1be 00:23:15.923 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z cf506415-f26f-49b3-9317-04c77703b1be ']' 00:23:15.923 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:16.181 [2024-07-24 19:59:07.641041] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:16.181 
[2024-07-24 19:59:07.641067] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:16.181 [2024-07-24 19:59:07.641117] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:16.181 [2024-07-24 19:59:07.641203] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:16.181 [2024-07-24 19:59:07.641215] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e09e80 name raid_bdev1, state offline 00:23:16.181 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.181 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:23:16.440 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:23:16.440 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:23:16.440 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:16.440 19:59:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:16.699 19:59:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:16.699 19:59:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:16.958 19:59:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:16.958 19:59:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:17.217 19:59:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:23:17.217 19:59:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:17.475 19:59:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:17.475 19:59:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:17.733 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:17.991 [2024-07-24 19:59:09.349491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:17.991 [2024-07-24 19:59:09.350839] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:17.991 [2024-07-24 19:59:09.350882] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:23:17.991 [2024-07-24 19:59:09.350916] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:23:17.991 [2024-07-24 19:59:09.350962] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:17.991 [2024-07-24 19:59:09.351001] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:17.991 [2024-07-24 19:59:09.351024] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:23:17.991 [2024-07-24 19:59:09.351048] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:23:17.991 [2024-07-24 
19:59:09.351066] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:17.991 [2024-07-24 19:59:09.351076] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dffc40 name raid_bdev1, state configuring 00:23:17.991 request: 00:23:17.991 { 00:23:17.991 "name": "raid_bdev1", 00:23:17.991 "raid_level": "raid1", 00:23:17.991 "base_bdevs": [ 00:23:17.991 "malloc1", 00:23:17.991 "malloc2", 00:23:17.991 "malloc3", 00:23:17.991 "malloc4" 00:23:17.991 ], 00:23:17.991 "superblock": false, 00:23:17.991 "method": "bdev_raid_create", 00:23:17.991 "req_id": 1 00:23:17.991 } 00:23:17.991 Got JSON-RPC error response 00:23:17.991 response: 00:23:17.991 { 00:23:17.991 "code": -17, 00:23:17.991 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:17.991 } 00:23:17.991 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:23:17.991 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:23:17.991 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:23:17.991 19:59:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:23:17.991 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.991 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:23:18.249 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:23:18.249 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:23:18.249 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:18.249 [2024-07-24 19:59:09.834719] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:18.249 [2024-07-24 19:59:09.834772] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.249 [2024-07-24 19:59:09.834798] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e00f40 00:23:18.249 [2024-07-24 19:59:09.834811] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.249 [2024-07-24 19:59:09.836472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.249 [2024-07-24 19:59:09.836500] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:18.249 [2024-07-24 19:59:09.836573] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:18.249 [2024-07-24 19:59:09.836601] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:18.249 pt1 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.508 19:59:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.508 19:59:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.767 19:59:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.767 "name": "raid_bdev1", 00:23:18.767 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:18.767 "strip_size_kb": 0, 00:23:18.767 "state": "configuring", 00:23:18.767 "raid_level": "raid1", 00:23:18.767 "superblock": true, 00:23:18.767 "num_base_bdevs": 4, 00:23:18.767 "num_base_bdevs_discovered": 1, 00:23:18.767 "num_base_bdevs_operational": 4, 00:23:18.767 "base_bdevs_list": [ 00:23:18.767 { 00:23:18.767 "name": "pt1", 00:23:18.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:18.767 "is_configured": true, 00:23:18.767 "data_offset": 2048, 00:23:18.767 "data_size": 63488 00:23:18.767 }, 00:23:18.767 { 00:23:18.767 "name": null, 00:23:18.767 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:18.767 "is_configured": false, 00:23:18.767 "data_offset": 2048, 00:23:18.767 "data_size": 63488 00:23:18.767 }, 00:23:18.767 { 00:23:18.767 "name": null, 00:23:18.767 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:18.767 "is_configured": false, 00:23:18.767 "data_offset": 2048, 00:23:18.767 "data_size": 63488 00:23:18.767 }, 00:23:18.767 { 00:23:18.767 "name": null, 00:23:18.767 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:18.767 "is_configured": false, 00:23:18.767 "data_offset": 2048, 00:23:18.767 "data_size": 63488 00:23:18.767 } 00:23:18.767 ] 00:23:18.767 }' 00:23:18.767 19:59:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.767 19:59:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:19.334 19:59:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:23:19.334 19:59:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:19.593 [2024-07-24 19:59:10.937649] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:19.593 [2024-07-24 19:59:10.937697] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:19.593 [2024-07-24 19:59:10.937716] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c51da0 00:23:19.593 [2024-07-24 19:59:10.937728] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:19.593 [2024-07-24 19:59:10.938074] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:19.593 [2024-07-24 19:59:10.938091] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:19.593 [2024-07-24 19:59:10.938161] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:19.593 [2024-07-24 19:59:10.938182] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:19.593 pt2 00:23:19.593 19:59:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:19.851 [2024-07-24 19:59:11.186320] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.851 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.851 "name": "raid_bdev1", 00:23:19.851 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:19.851 "strip_size_kb": 0, 00:23:19.851 "state": "configuring", 00:23:19.851 "raid_level": "raid1", 00:23:19.851 "superblock": true, 00:23:19.851 "num_base_bdevs": 4, 00:23:19.851 "num_base_bdevs_discovered": 1, 00:23:19.851 "num_base_bdevs_operational": 4, 00:23:19.851 "base_bdevs_list": [ 00:23:19.851 { 00:23:19.851 "name": "pt1", 00:23:19.851 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:19.851 "is_configured": true, 00:23:19.851 "data_offset": 2048, 00:23:19.851 "data_size": 63488 00:23:19.851 }, 00:23:19.851 { 00:23:19.851 "name": null, 00:23:19.851 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:19.851 "is_configured": false, 00:23:19.852 "data_offset": 2048, 00:23:19.852 
"data_size": 63488 00:23:19.852 }, 00:23:19.852 { 00:23:19.852 "name": null, 00:23:19.852 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:19.852 "is_configured": false, 00:23:19.852 "data_offset": 2048, 00:23:19.852 "data_size": 63488 00:23:19.852 }, 00:23:19.852 { 00:23:19.852 "name": null, 00:23:19.852 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:19.852 "is_configured": false, 00:23:19.852 "data_offset": 2048, 00:23:19.852 "data_size": 63488 00:23:19.852 } 00:23:19.852 ] 00:23:19.852 }' 00:23:19.852 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.852 19:59:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:20.418 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:23:20.418 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:20.418 19:59:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:20.676 [2024-07-24 19:59:12.221054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:20.676 [2024-07-24 19:59:12.221099] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:20.676 [2024-07-24 19:59:12.221117] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c4fc90 00:23:20.676 [2024-07-24 19:59:12.221130] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:20.676 [2024-07-24 19:59:12.221465] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:20.676 [2024-07-24 19:59:12.221482] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:20.676 [2024-07-24 19:59:12.221549] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:23:20.676 [2024-07-24 19:59:12.221570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:20.676 pt2 00:23:20.676 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:23:20.676 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:20.676 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:20.934 [2024-07-24 19:59:12.465710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:20.934 [2024-07-24 19:59:12.465743] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:20.934 [2024-07-24 19:59:12.465759] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dfea60 00:23:20.934 [2024-07-24 19:59:12.465771] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:20.934 [2024-07-24 19:59:12.466070] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:20.934 [2024-07-24 19:59:12.466087] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:20.934 [2024-07-24 19:59:12.466138] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:20.934 [2024-07-24 19:59:12.466157] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:20.934 pt3 00:23:20.934 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:23:20.934 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:20.934 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:23:21.193 [2024-07-24 19:59:12.714369] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:21.193 [2024-07-24 19:59:12.714411] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:21.193 [2024-07-24 19:59:12.714427] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c505d0 00:23:21.193 [2024-07-24 19:59:12.714439] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:21.193 [2024-07-24 19:59:12.714728] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:21.193 [2024-07-24 19:59:12.714744] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:21.193 [2024-07-24 19:59:12.714795] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:21.193 [2024-07-24 19:59:12.714815] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:21.193 [2024-07-24 19:59:12.714936] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c4f530 00:23:21.193 [2024-07-24 19:59:12.714947] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:21.193 [2024-07-24 19:59:12.715109] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c55300 00:23:21.193 [2024-07-24 19:59:12.715240] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c4f530 00:23:21.193 [2024-07-24 19:59:12.715250] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c4f530 00:23:21.193 [2024-07-24 19:59:12.715341] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.193 pt4 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:23:21.193 19:59:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.193 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.452 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:21.452 "name": "raid_bdev1", 00:23:21.452 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:21.452 "strip_size_kb": 0, 00:23:21.452 "state": "online", 00:23:21.452 "raid_level": "raid1", 00:23:21.452 "superblock": true, 00:23:21.452 "num_base_bdevs": 4, 00:23:21.452 "num_base_bdevs_discovered": 4, 00:23:21.452 "num_base_bdevs_operational": 4, 00:23:21.452 "base_bdevs_list": [ 00:23:21.452 { 00:23:21.452 "name": "pt1", 00:23:21.452 "uuid": "00000000-0000-0000-0000-000000000001", 
00:23:21.452 "is_configured": true, 00:23:21.452 "data_offset": 2048, 00:23:21.452 "data_size": 63488 00:23:21.452 }, 00:23:21.452 { 00:23:21.452 "name": "pt2", 00:23:21.452 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:21.452 "is_configured": true, 00:23:21.452 "data_offset": 2048, 00:23:21.452 "data_size": 63488 00:23:21.452 }, 00:23:21.452 { 00:23:21.452 "name": "pt3", 00:23:21.452 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:21.452 "is_configured": true, 00:23:21.452 "data_offset": 2048, 00:23:21.452 "data_size": 63488 00:23:21.452 }, 00:23:21.452 { 00:23:21.452 "name": "pt4", 00:23:21.452 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:21.452 "is_configured": true, 00:23:21.452 "data_offset": 2048, 00:23:21.452 "data_size": 63488 00:23:21.452 } 00:23:21.452 ] 00:23:21.452 }' 00:23:21.452 19:59:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:21.452 19:59:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:22.019 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:22.276 [2024-07-24 19:59:13.825620] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:22.276 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:22.276 "name": "raid_bdev1", 00:23:22.276 "aliases": [ 00:23:22.276 "cf506415-f26f-49b3-9317-04c77703b1be" 00:23:22.276 ], 00:23:22.276 "product_name": "Raid Volume", 00:23:22.276 "block_size": 512, 00:23:22.276 "num_blocks": 63488, 00:23:22.276 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:22.276 "assigned_rate_limits": { 00:23:22.276 "rw_ios_per_sec": 0, 00:23:22.276 "rw_mbytes_per_sec": 0, 00:23:22.276 "r_mbytes_per_sec": 0, 00:23:22.276 "w_mbytes_per_sec": 0 00:23:22.276 }, 00:23:22.276 "claimed": false, 00:23:22.276 "zoned": false, 00:23:22.276 "supported_io_types": { 00:23:22.276 "read": true, 00:23:22.276 "write": true, 00:23:22.276 "unmap": false, 00:23:22.276 "flush": false, 00:23:22.276 "reset": true, 00:23:22.276 "nvme_admin": false, 00:23:22.276 "nvme_io": false, 00:23:22.276 "nvme_io_md": false, 00:23:22.276 "write_zeroes": true, 00:23:22.276 "zcopy": false, 00:23:22.276 "get_zone_info": false, 00:23:22.276 "zone_management": false, 00:23:22.276 "zone_append": false, 00:23:22.276 "compare": false, 00:23:22.276 "compare_and_write": false, 00:23:22.276 "abort": false, 00:23:22.276 "seek_hole": false, 00:23:22.276 "seek_data": false, 00:23:22.276 "copy": false, 00:23:22.276 "nvme_iov_md": false 00:23:22.276 }, 00:23:22.276 "memory_domains": [ 00:23:22.276 { 00:23:22.276 "dma_device_id": "system", 00:23:22.276 "dma_device_type": 1 00:23:22.276 }, 00:23:22.276 { 00:23:22.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.276 "dma_device_type": 2 00:23:22.276 }, 00:23:22.276 { 00:23:22.276 "dma_device_id": "system", 00:23:22.276 "dma_device_type": 1 00:23:22.276 }, 00:23:22.276 { 00:23:22.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.276 "dma_device_type": 2 00:23:22.276 }, 00:23:22.276 { 00:23:22.276 "dma_device_id": "system", 00:23:22.276 
"dma_device_type": 1 00:23:22.277 }, 00:23:22.277 { 00:23:22.277 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.277 "dma_device_type": 2 00:23:22.277 }, 00:23:22.277 { 00:23:22.277 "dma_device_id": "system", 00:23:22.277 "dma_device_type": 1 00:23:22.277 }, 00:23:22.277 { 00:23:22.277 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.277 "dma_device_type": 2 00:23:22.277 } 00:23:22.277 ], 00:23:22.277 "driver_specific": { 00:23:22.277 "raid": { 00:23:22.277 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:22.277 "strip_size_kb": 0, 00:23:22.277 "state": "online", 00:23:22.277 "raid_level": "raid1", 00:23:22.277 "superblock": true, 00:23:22.277 "num_base_bdevs": 4, 00:23:22.277 "num_base_bdevs_discovered": 4, 00:23:22.277 "num_base_bdevs_operational": 4, 00:23:22.277 "base_bdevs_list": [ 00:23:22.277 { 00:23:22.277 "name": "pt1", 00:23:22.277 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:22.277 "is_configured": true, 00:23:22.277 "data_offset": 2048, 00:23:22.277 "data_size": 63488 00:23:22.277 }, 00:23:22.277 { 00:23:22.277 "name": "pt2", 00:23:22.277 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:22.277 "is_configured": true, 00:23:22.277 "data_offset": 2048, 00:23:22.277 "data_size": 63488 00:23:22.277 }, 00:23:22.277 { 00:23:22.277 "name": "pt3", 00:23:22.277 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:22.277 "is_configured": true, 00:23:22.277 "data_offset": 2048, 00:23:22.277 "data_size": 63488 00:23:22.277 }, 00:23:22.277 { 00:23:22.277 "name": "pt4", 00:23:22.277 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:22.277 "is_configured": true, 00:23:22.277 "data_offset": 2048, 00:23:22.277 "data_size": 63488 00:23:22.277 } 00:23:22.277 ] 00:23:22.277 } 00:23:22.277 } 00:23:22.277 }' 00:23:22.277 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:22.535 19:59:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:22.535 pt2 00:23:22.535 pt3 00:23:22.535 pt4' 00:23:22.535 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:22.535 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:22.535 19:59:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:22.794 "name": "pt1", 00:23:22.794 "aliases": [ 00:23:22.794 "00000000-0000-0000-0000-000000000001" 00:23:22.794 ], 00:23:22.794 "product_name": "passthru", 00:23:22.794 "block_size": 512, 00:23:22.794 "num_blocks": 65536, 00:23:22.794 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:22.794 "assigned_rate_limits": { 00:23:22.794 "rw_ios_per_sec": 0, 00:23:22.794 "rw_mbytes_per_sec": 0, 00:23:22.794 "r_mbytes_per_sec": 0, 00:23:22.794 "w_mbytes_per_sec": 0 00:23:22.794 }, 00:23:22.794 "claimed": true, 00:23:22.794 "claim_type": "exclusive_write", 00:23:22.794 "zoned": false, 00:23:22.794 "supported_io_types": { 00:23:22.794 "read": true, 00:23:22.794 "write": true, 00:23:22.794 "unmap": true, 00:23:22.794 "flush": true, 00:23:22.794 "reset": true, 00:23:22.794 "nvme_admin": false, 00:23:22.794 "nvme_io": false, 00:23:22.794 "nvme_io_md": false, 00:23:22.794 "write_zeroes": true, 00:23:22.794 "zcopy": true, 00:23:22.794 "get_zone_info": false, 00:23:22.794 "zone_management": false, 00:23:22.794 "zone_append": false, 00:23:22.794 "compare": false, 00:23:22.794 "compare_and_write": false, 00:23:22.794 "abort": true, 00:23:22.794 "seek_hole": false, 00:23:22.794 "seek_data": false, 00:23:22.794 "copy": true, 00:23:22.794 "nvme_iov_md": false 00:23:22.794 }, 00:23:22.794 "memory_domains": [ 00:23:22.794 { 00:23:22.794 "dma_device_id": "system", 00:23:22.794 
"dma_device_type": 1 00:23:22.794 }, 00:23:22.794 { 00:23:22.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:22.794 "dma_device_type": 2 00:23:22.794 } 00:23:22.794 ], 00:23:22.794 "driver_specific": { 00:23:22.794 "passthru": { 00:23:22.794 "name": "pt1", 00:23:22.794 "base_bdev_name": "malloc1" 00:23:22.794 } 00:23:22.794 } 00:23:22.794 }' 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:22.794 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:23.052 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:23.052 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:23.052 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:23.052 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:23.052 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:23.052 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:23.052 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:23.311 19:59:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:23.311 "name": "pt2", 00:23:23.311 "aliases": [ 00:23:23.311 "00000000-0000-0000-0000-000000000002" 00:23:23.311 ], 00:23:23.311 "product_name": "passthru", 00:23:23.311 "block_size": 512, 00:23:23.311 "num_blocks": 65536, 00:23:23.311 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:23.311 "assigned_rate_limits": { 00:23:23.311 "rw_ios_per_sec": 0, 00:23:23.311 "rw_mbytes_per_sec": 0, 00:23:23.311 "r_mbytes_per_sec": 0, 00:23:23.311 "w_mbytes_per_sec": 0 00:23:23.311 }, 00:23:23.311 "claimed": true, 00:23:23.311 "claim_type": "exclusive_write", 00:23:23.311 "zoned": false, 00:23:23.311 "supported_io_types": { 00:23:23.311 "read": true, 00:23:23.311 "write": true, 00:23:23.311 "unmap": true, 00:23:23.311 "flush": true, 00:23:23.311 "reset": true, 00:23:23.311 "nvme_admin": false, 00:23:23.311 "nvme_io": false, 00:23:23.311 "nvme_io_md": false, 00:23:23.311 "write_zeroes": true, 00:23:23.311 "zcopy": true, 00:23:23.311 "get_zone_info": false, 00:23:23.311 "zone_management": false, 00:23:23.311 "zone_append": false, 00:23:23.311 "compare": false, 00:23:23.311 "compare_and_write": false, 00:23:23.311 "abort": true, 00:23:23.311 "seek_hole": false, 00:23:23.311 "seek_data": false, 00:23:23.311 "copy": true, 00:23:23.311 "nvme_iov_md": false 00:23:23.311 }, 00:23:23.311 "memory_domains": [ 00:23:23.311 { 00:23:23.311 "dma_device_id": "system", 00:23:23.311 "dma_device_type": 1 00:23:23.311 }, 00:23:23.311 { 00:23:23.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:23.311 "dma_device_type": 2 00:23:23.311 } 00:23:23.311 ], 00:23:23.311 "driver_specific": { 00:23:23.311 "passthru": { 00:23:23.311 "name": "pt2", 00:23:23.311 "base_bdev_name": "malloc2" 00:23:23.311 } 00:23:23.311 } 00:23:23.311 }' 00:23:23.311 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:23.311 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:23.311 19:59:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:23.311 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:23.311 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:23.570 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:23.570 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:23.570 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:23.570 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:23.570 19:59:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:23.570 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:23.570 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:23.570 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:23.570 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:23.571 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:23.829 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:23.829 "name": "pt3", 00:23:23.829 "aliases": [ 00:23:23.829 "00000000-0000-0000-0000-000000000003" 00:23:23.829 ], 00:23:23.829 "product_name": "passthru", 00:23:23.829 "block_size": 512, 00:23:23.829 "num_blocks": 65536, 00:23:23.829 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:23.829 "assigned_rate_limits": { 00:23:23.829 "rw_ios_per_sec": 0, 00:23:23.829 "rw_mbytes_per_sec": 0, 00:23:23.829 "r_mbytes_per_sec": 0, 00:23:23.829 "w_mbytes_per_sec": 0 00:23:23.829 }, 00:23:23.829 "claimed": true, 00:23:23.829 
"claim_type": "exclusive_write", 00:23:23.829 "zoned": false, 00:23:23.829 "supported_io_types": { 00:23:23.829 "read": true, 00:23:23.829 "write": true, 00:23:23.829 "unmap": true, 00:23:23.829 "flush": true, 00:23:23.829 "reset": true, 00:23:23.829 "nvme_admin": false, 00:23:23.829 "nvme_io": false, 00:23:23.829 "nvme_io_md": false, 00:23:23.829 "write_zeroes": true, 00:23:23.829 "zcopy": true, 00:23:23.829 "get_zone_info": false, 00:23:23.829 "zone_management": false, 00:23:23.829 "zone_append": false, 00:23:23.829 "compare": false, 00:23:23.830 "compare_and_write": false, 00:23:23.830 "abort": true, 00:23:23.830 "seek_hole": false, 00:23:23.830 "seek_data": false, 00:23:23.830 "copy": true, 00:23:23.830 "nvme_iov_md": false 00:23:23.830 }, 00:23:23.830 "memory_domains": [ 00:23:23.830 { 00:23:23.830 "dma_device_id": "system", 00:23:23.830 "dma_device_type": 1 00:23:23.830 }, 00:23:23.830 { 00:23:23.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:23.830 "dma_device_type": 2 00:23:23.830 } 00:23:23.830 ], 00:23:23.830 "driver_specific": { 00:23:23.830 "passthru": { 00:23:23.830 "name": "pt3", 00:23:23.830 "base_bdev_name": "malloc3" 00:23:23.830 } 00:23:23.830 } 00:23:23.830 }' 00:23:23.830 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:23.830 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:24.088 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:24.347 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:24.347 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:24.347 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:24.347 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:24.606 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:24.606 "name": "pt4", 00:23:24.606 "aliases": [ 00:23:24.606 "00000000-0000-0000-0000-000000000004" 00:23:24.606 ], 00:23:24.606 "product_name": "passthru", 00:23:24.606 "block_size": 512, 00:23:24.606 "num_blocks": 65536, 00:23:24.606 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:24.606 "assigned_rate_limits": { 00:23:24.606 "rw_ios_per_sec": 0, 00:23:24.606 "rw_mbytes_per_sec": 0, 00:23:24.606 "r_mbytes_per_sec": 0, 00:23:24.606 "w_mbytes_per_sec": 0 00:23:24.606 }, 00:23:24.606 "claimed": true, 00:23:24.606 "claim_type": "exclusive_write", 00:23:24.606 "zoned": false, 00:23:24.606 "supported_io_types": { 00:23:24.606 "read": true, 00:23:24.606 "write": true, 00:23:24.606 "unmap": true, 00:23:24.606 "flush": true, 00:23:24.606 "reset": true, 00:23:24.606 "nvme_admin": false, 00:23:24.606 "nvme_io": false, 00:23:24.606 "nvme_io_md": false, 00:23:24.606 "write_zeroes": true, 00:23:24.606 "zcopy": true, 00:23:24.606 "get_zone_info": false, 00:23:24.606 "zone_management": false, 00:23:24.606 "zone_append": false, 00:23:24.606 "compare": false, 00:23:24.606 
"compare_and_write": false, 00:23:24.606 "abort": true, 00:23:24.606 "seek_hole": false, 00:23:24.606 "seek_data": false, 00:23:24.606 "copy": true, 00:23:24.606 "nvme_iov_md": false 00:23:24.606 }, 00:23:24.606 "memory_domains": [ 00:23:24.606 { 00:23:24.606 "dma_device_id": "system", 00:23:24.606 "dma_device_type": 1 00:23:24.606 }, 00:23:24.606 { 00:23:24.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.606 "dma_device_type": 2 00:23:24.606 } 00:23:24.606 ], 00:23:24.606 "driver_specific": { 00:23:24.606 "passthru": { 00:23:24.606 "name": "pt4", 00:23:24.606 "base_bdev_name": "malloc4" 00:23:24.606 } 00:23:24.606 } 00:23:24.606 }' 00:23:24.606 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.606 19:59:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:24.606 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:24.606 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.606 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:24.606 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:24.606 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:24.606 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:24.865 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:24.865 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:24.865 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:24.865 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:24.865 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:24.865 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:23:25.124 [2024-07-24 19:59:16.520779] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:25.124 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' cf506415-f26f-49b3-9317-04c77703b1be '!=' cf506415-f26f-49b3-9317-04c77703b1be ']' 00:23:25.124 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:23:25.124 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:25.124 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:25.124 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:25.384 [2024-07-24 19:59:16.773191] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.384 19:59:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.643 19:59:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:25.643 "name": "raid_bdev1", 00:23:25.643 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:25.643 "strip_size_kb": 0, 00:23:25.643 "state": "online", 00:23:25.643 "raid_level": "raid1", 00:23:25.643 "superblock": true, 00:23:25.643 "num_base_bdevs": 4, 00:23:25.643 "num_base_bdevs_discovered": 3, 00:23:25.643 "num_base_bdevs_operational": 3, 00:23:25.643 "base_bdevs_list": [ 00:23:25.643 { 00:23:25.643 "name": null, 00:23:25.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.643 "is_configured": false, 00:23:25.643 "data_offset": 2048, 00:23:25.643 "data_size": 63488 00:23:25.643 }, 00:23:25.643 { 00:23:25.643 "name": "pt2", 00:23:25.643 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:25.643 "is_configured": true, 00:23:25.643 "data_offset": 2048, 00:23:25.643 "data_size": 63488 00:23:25.643 }, 00:23:25.643 { 00:23:25.643 "name": "pt3", 00:23:25.643 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:25.643 "is_configured": true, 00:23:25.643 "data_offset": 2048, 00:23:25.643 "data_size": 63488 00:23:25.643 }, 00:23:25.643 { 00:23:25.643 "name": "pt4", 00:23:25.643 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:25.643 "is_configured": true, 00:23:25.644 "data_offset": 2048, 00:23:25.644 "data_size": 63488 00:23:25.644 } 00:23:25.644 ] 00:23:25.644 }' 00:23:25.644 19:59:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:25.644 
19:59:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:26.211 19:59:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:26.469 [2024-07-24 19:59:17.860060] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:26.469 [2024-07-24 19:59:17.860089] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:26.469 [2024-07-24 19:59:17.860147] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:26.469 [2024-07-24 19:59:17.860217] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:26.470 [2024-07-24 19:59:17.860229] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c4f530 name raid_bdev1, state offline 00:23:26.470 19:59:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.470 19:59:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:23:26.732 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:23:26.732 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:23:26.732 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:23:26.732 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:26.732 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:26.990 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:23:26.990 19:59:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:26.990 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:27.249 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:23:27.249 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:27.249 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:27.508 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:23:27.508 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:23:27.508 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:23:27.508 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:23:27.508 19:59:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:27.508 [2024-07-24 19:59:19.075215] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:27.508 [2024-07-24 19:59:19.075262] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:27.508 [2024-07-24 19:59:19.075280] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e00f40 00:23:27.508 [2024-07-24 19:59:19.075293] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:27.508 [2024-07-24 19:59:19.076895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:27.508 [2024-07-24 19:59:19.076924] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:23:27.508 [2024-07-24 19:59:19.076995] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:27.508 [2024-07-24 19:59:19.077023] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:27.508 pt2 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:27.508 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.767 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.767 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.767 "name": "raid_bdev1", 00:23:27.767 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:27.767 "strip_size_kb": 0, 00:23:27.767 "state": "configuring", 
00:23:27.767 "raid_level": "raid1", 00:23:27.767 "superblock": true, 00:23:27.767 "num_base_bdevs": 4, 00:23:27.767 "num_base_bdevs_discovered": 1, 00:23:27.767 "num_base_bdevs_operational": 3, 00:23:27.767 "base_bdevs_list": [ 00:23:27.767 { 00:23:27.767 "name": null, 00:23:27.767 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.767 "is_configured": false, 00:23:27.767 "data_offset": 2048, 00:23:27.767 "data_size": 63488 00:23:27.767 }, 00:23:27.767 { 00:23:27.767 "name": "pt2", 00:23:27.767 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:27.767 "is_configured": true, 00:23:27.767 "data_offset": 2048, 00:23:27.767 "data_size": 63488 00:23:27.767 }, 00:23:27.767 { 00:23:27.767 "name": null, 00:23:27.767 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:27.767 "is_configured": false, 00:23:27.767 "data_offset": 2048, 00:23:27.767 "data_size": 63488 00:23:27.767 }, 00:23:27.767 { 00:23:27.767 "name": null, 00:23:27.767 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:27.767 "is_configured": false, 00:23:27.767 "data_offset": 2048, 00:23:27.767 "data_size": 63488 00:23:27.767 } 00:23:27.767 ] 00:23:27.767 }' 00:23:27.767 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.767 19:59:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:28.704 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:23:28.704 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:23:28.704 19:59:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:28.704 [2024-07-24 19:59:20.170148] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:28.704 [2024-07-24 19:59:20.170203] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:28.704 [2024-07-24 19:59:20.170222] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c51370 00:23:28.704 [2024-07-24 19:59:20.170234] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:28.704 [2024-07-24 19:59:20.170582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:28.704 [2024-07-24 19:59:20.170600] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:28.704 [2024-07-24 19:59:20.170667] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:28.704 [2024-07-24 19:59:20.170687] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:28.704 pt3 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.704 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.963 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.964 "name": "raid_bdev1", 00:23:28.964 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:28.964 "strip_size_kb": 0, 00:23:28.964 "state": "configuring", 00:23:28.964 "raid_level": "raid1", 00:23:28.964 "superblock": true, 00:23:28.964 "num_base_bdevs": 4, 00:23:28.964 "num_base_bdevs_discovered": 2, 00:23:28.964 "num_base_bdevs_operational": 3, 00:23:28.964 "base_bdevs_list": [ 00:23:28.964 { 00:23:28.964 "name": null, 00:23:28.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.964 "is_configured": false, 00:23:28.964 "data_offset": 2048, 00:23:28.964 "data_size": 63488 00:23:28.964 }, 00:23:28.964 { 00:23:28.964 "name": "pt2", 00:23:28.964 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:28.964 "is_configured": true, 00:23:28.964 "data_offset": 2048, 00:23:28.964 "data_size": 63488 00:23:28.964 }, 00:23:28.964 { 00:23:28.964 "name": "pt3", 00:23:28.964 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:28.964 "is_configured": true, 00:23:28.964 "data_offset": 2048, 00:23:28.964 "data_size": 63488 00:23:28.964 }, 00:23:28.964 { 00:23:28.964 "name": null, 00:23:28.964 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:28.964 "is_configured": false, 00:23:28.964 "data_offset": 2048, 00:23:28.964 "data_size": 63488 00:23:28.964 } 00:23:28.964 ] 00:23:28.964 }' 00:23:28.964 19:59:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.964 19:59:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:29.531 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:23:29.531 19:59:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:23:29.531 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=3 00:23:29.531 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:29.790 [2024-07-24 19:59:21.281086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:29.790 [2024-07-24 19:59:21.281133] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.790 [2024-07-24 19:59:21.281155] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dfdb60 00:23:29.790 [2024-07-24 19:59:21.281168] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.790 [2024-07-24 19:59:21.281526] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.790 [2024-07-24 19:59:21.281545] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:29.790 [2024-07-24 19:59:21.281611] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:29.790 [2024-07-24 19:59:21.281632] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:29.790 [2024-07-24 19:59:21.281748] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dfd0b0 00:23:29.790 [2024-07-24 19:59:21.281758] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:29.790 [2024-07-24 19:59:21.281929] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c58ba0 00:23:29.790 [2024-07-24 19:59:21.282060] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dfd0b0 00:23:29.790 [2024-07-24 19:59:21.282070] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x1dfd0b0 00:23:29.790 [2024-07-24 19:59:21.282166] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:29.790 pt4 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.790 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.049 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.049 "name": "raid_bdev1", 00:23:30.049 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:30.049 "strip_size_kb": 0, 00:23:30.049 "state": "online", 00:23:30.049 "raid_level": "raid1", 00:23:30.049 "superblock": true, 00:23:30.049 "num_base_bdevs": 4, 00:23:30.049 "num_base_bdevs_discovered": 3, 00:23:30.049 
"num_base_bdevs_operational": 3, 00:23:30.049 "base_bdevs_list": [ 00:23:30.049 { 00:23:30.049 "name": null, 00:23:30.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.049 "is_configured": false, 00:23:30.049 "data_offset": 2048, 00:23:30.049 "data_size": 63488 00:23:30.049 }, 00:23:30.049 { 00:23:30.049 "name": "pt2", 00:23:30.049 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:30.049 "is_configured": true, 00:23:30.049 "data_offset": 2048, 00:23:30.049 "data_size": 63488 00:23:30.049 }, 00:23:30.049 { 00:23:30.049 "name": "pt3", 00:23:30.049 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:30.049 "is_configured": true, 00:23:30.049 "data_offset": 2048, 00:23:30.049 "data_size": 63488 00:23:30.049 }, 00:23:30.049 { 00:23:30.049 "name": "pt4", 00:23:30.049 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:30.049 "is_configured": true, 00:23:30.049 "data_offset": 2048, 00:23:30.049 "data_size": 63488 00:23:30.049 } 00:23:30.049 ] 00:23:30.049 }' 00:23:30.049 19:59:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.049 19:59:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:30.617 19:59:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:31.184 [2024-07-24 19:59:22.636664] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:31.184 [2024-07-24 19:59:22.636691] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:31.184 [2024-07-24 19:59:22.636746] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:31.184 [2024-07-24 19:59:22.636813] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:31.184 [2024-07-24 19:59:22.636825] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x1dfd0b0 name raid_bdev1, state offline 00:23:31.184 19:59:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.184 19:59:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:23:31.788 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:23:31.788 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:23:31.788 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 4 -gt 2 ']' 00:23:31.788 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=3 00:23:31.788 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:32.047 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:32.306 [2024-07-24 19:59:23.655300] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:32.306 [2024-07-24 19:59:23.655347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:32.306 [2024-07-24 19:59:23.655364] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c4f7b0 00:23:32.306 [2024-07-24 19:59:23.655376] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:32.306 [2024-07-24 19:59:23.656984] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:32.306 [2024-07-24 19:59:23.657013] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:32.306 [2024-07-24 19:59:23.657081] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:23:32.306 [2024-07-24 19:59:23.657108] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:32.306 [2024-07-24 19:59:23.657205] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:32.306 [2024-07-24 19:59:23.657218] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:32.306 [2024-07-24 19:59:23.657233] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c4f0a0 name raid_bdev1, state configuring 00:23:32.306 [2024-07-24 19:59:23.657257] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:32.306 [2024-07-24 19:59:23.657330] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:32.306 pt1 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 4 -gt 2 ']' 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.306 19:59:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.306 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.565 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.565 "name": "raid_bdev1", 00:23:32.565 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:32.565 "strip_size_kb": 0, 00:23:32.565 "state": "configuring", 00:23:32.565 "raid_level": "raid1", 00:23:32.565 "superblock": true, 00:23:32.565 "num_base_bdevs": 4, 00:23:32.565 "num_base_bdevs_discovered": 2, 00:23:32.565 "num_base_bdevs_operational": 3, 00:23:32.565 "base_bdevs_list": [ 00:23:32.565 { 00:23:32.565 "name": null, 00:23:32.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.565 "is_configured": false, 00:23:32.565 "data_offset": 2048, 00:23:32.565 "data_size": 63488 00:23:32.565 }, 00:23:32.565 { 00:23:32.565 "name": "pt2", 00:23:32.565 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:32.565 "is_configured": true, 00:23:32.565 "data_offset": 2048, 00:23:32.565 "data_size": 63488 00:23:32.565 }, 00:23:32.565 { 00:23:32.565 "name": "pt3", 00:23:32.565 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:32.565 "is_configured": true, 00:23:32.565 "data_offset": 2048, 00:23:32.565 "data_size": 63488 00:23:32.565 }, 00:23:32.565 { 00:23:32.565 "name": null, 00:23:32.565 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:32.565 "is_configured": false, 00:23:32.565 "data_offset": 2048, 00:23:32.565 "data_size": 63488 00:23:32.565 } 00:23:32.565 ] 00:23:32.565 }' 00:23:32.565 19:59:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.565 19:59:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:33.132 19:59:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:23:33.132 19:59:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:33.390 19:59:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:23:33.390 19:59:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:33.956 [2024-07-24 19:59:25.247532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:33.956 [2024-07-24 19:59:25.247584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:33.956 [2024-07-24 19:59:25.247603] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c509a0 00:23:33.956 [2024-07-24 19:59:25.247616] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:33.956 [2024-07-24 19:59:25.247963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:33.956 [2024-07-24 19:59:25.247980] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:33.956 [2024-07-24 19:59:25.248047] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:33.956 [2024-07-24 19:59:25.248069] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:33.956 [2024-07-24 19:59:25.248187] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c587c0 00:23:33.956 [2024-07-24 19:59:25.248197] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:33.956 [2024-07-24 19:59:25.248365] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1c4fa40 00:23:33.956 [2024-07-24 19:59:25.248515] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c587c0 00:23:33.956 [2024-07-24 19:59:25.248526] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c587c0 00:23:33.956 [2024-07-24 19:59:25.248626] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:33.956 pt4 00:23:33.956 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:33.956 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:33.956 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:33.956 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:33.956 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:33.956 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:33.956 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:33.957 "name": "raid_bdev1", 
00:23:33.957 "uuid": "cf506415-f26f-49b3-9317-04c77703b1be", 00:23:33.957 "strip_size_kb": 0, 00:23:33.957 "state": "online", 00:23:33.957 "raid_level": "raid1", 00:23:33.957 "superblock": true, 00:23:33.957 "num_base_bdevs": 4, 00:23:33.957 "num_base_bdevs_discovered": 3, 00:23:33.957 "num_base_bdevs_operational": 3, 00:23:33.957 "base_bdevs_list": [ 00:23:33.957 { 00:23:33.957 "name": null, 00:23:33.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.957 "is_configured": false, 00:23:33.957 "data_offset": 2048, 00:23:33.957 "data_size": 63488 00:23:33.957 }, 00:23:33.957 { 00:23:33.957 "name": "pt2", 00:23:33.957 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:33.957 "is_configured": true, 00:23:33.957 "data_offset": 2048, 00:23:33.957 "data_size": 63488 00:23:33.957 }, 00:23:33.957 { 00:23:33.957 "name": "pt3", 00:23:33.957 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:33.957 "is_configured": true, 00:23:33.957 "data_offset": 2048, 00:23:33.957 "data_size": 63488 00:23:33.957 }, 00:23:33.957 { 00:23:33.957 "name": "pt4", 00:23:33.957 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:33.957 "is_configured": true, 00:23:33.957 "data_offset": 2048, 00:23:33.957 "data_size": 63488 00:23:33.957 } 00:23:33.957 ] 00:23:33.957 }' 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:33.957 19:59:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:34.523 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:34.523 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:34.780 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:23:34.780 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:34.780 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:23:35.039 [2024-07-24 19:59:26.511173] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' cf506415-f26f-49b3-9317-04c77703b1be '!=' cf506415-f26f-49b3-9317-04c77703b1be ']' 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 1478375 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 1478375 ']' 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 1478375 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1478375 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1478375' 00:23:35.039 killing process with pid 1478375 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 1478375 00:23:35.039 [2024-07-24 19:59:26.578682] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:35.039 [2024-07-24 19:59:26.578733] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:35.039 [2024-07-24 19:59:26.578804] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:23:35.039 [2024-07-24 19:59:26.578817] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c587c0 name raid_bdev1, state offline 00:23:35.039 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 1478375 00:23:35.039 [2024-07-24 19:59:26.620466] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:35.297 19:59:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:23:35.297 00:23:35.297 real 0m28.337s 00:23:35.297 user 0m51.878s 00:23:35.297 sys 0m4.976s 00:23:35.297 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:35.297 19:59:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:35.297 ************************************ 00:23:35.297 END TEST raid_superblock_test 00:23:35.297 ************************************ 00:23:35.297 19:59:26 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:23:35.555 19:59:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:23:35.555 19:59:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:35.555 19:59:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:35.555 ************************************ 00:23:35.555 START TEST raid_read_error_test 00:23:35.555 ************************************ 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 read 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 
00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:23:35.555 19:59:26 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.JCjdEvV2D8 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1482571 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1482571 /var/tmp/spdk-raid.sock 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 1482571 ']' 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:35.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:35.555 19:59:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:35.555 [2024-07-24 19:59:27.022125] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:23:35.555 [2024-07-24 19:59:27.022200] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1482571 ] 00:23:35.813 [2024-07-24 19:59:27.151746] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:35.813 [2024-07-24 19:59:27.256905] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:35.813 [2024-07-24 19:59:27.318558] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:35.813 [2024-07-24 19:59:27.318593] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:36.378 19:59:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:36.378 19:59:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:23:36.378 19:59:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:36.378 19:59:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:36.635 BaseBdev1_malloc 00:23:36.635 19:59:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:36.893 true 00:23:36.893 19:59:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:36.893 [2024-07-24 19:59:28.467822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:36.893 [2024-07-24 19:59:28.467869] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:23:36.893 [2024-07-24 19:59:28.467889] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20183a0 00:23:36.893 [2024-07-24 19:59:28.467901] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:36.893 [2024-07-24 19:59:28.469485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:36.893 [2024-07-24 19:59:28.469514] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:36.893 BaseBdev1 00:23:37.151 19:59:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:37.151 19:59:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:37.151 BaseBdev2_malloc 00:23:37.151 19:59:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:37.409 true 00:23:37.409 19:59:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:37.667 [2024-07-24 19:59:29.017878] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:37.667 [2024-07-24 19:59:29.017924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:37.667 [2024-07-24 19:59:29.017948] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d7370 00:23:37.667 [2024-07-24 19:59:29.017960] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:37.667 [2024-07-24 19:59:29.019487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:37.667 [2024-07-24 19:59:29.019517] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:37.667 BaseBdev2 00:23:37.667 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:37.667 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:37.667 BaseBdev3_malloc 00:23:37.667 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:37.925 true 00:23:37.925 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:38.185 [2024-07-24 19:59:29.543814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:38.185 [2024-07-24 19:59:29.543857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:38.185 [2024-07-24 19:59:29.543880] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x200d2d0 00:23:38.185 [2024-07-24 19:59:29.543892] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:38.185 [2024-07-24 19:59:29.545341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:38.185 [2024-07-24 19:59:29.545369] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:38.185 BaseBdev3 00:23:38.185 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:38.185 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4_malloc 00:23:38.185 BaseBdev4_malloc 00:23:38.185 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:38.443 true 00:23:38.443 19:59:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:38.702 [2024-07-24 19:59:30.077775] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:38.702 [2024-07-24 19:59:30.077819] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:38.702 [2024-07-24 19:59:30.077843] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2010310 00:23:38.702 [2024-07-24 19:59:30.077856] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:38.702 [2024-07-24 19:59:30.079359] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:38.702 [2024-07-24 19:59:30.079387] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:38.702 BaseBdev4 00:23:38.702 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:38.702 [2024-07-24 19:59:30.270329] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:38.702 [2024-07-24 19:59:30.271507] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:38.702 [2024-07-24 19:59:30.271574] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:38.702 [2024-07-24 19:59:30.271633] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:23:38.702 [2024-07-24 19:59:30.271865] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2011060 00:23:38.702 [2024-07-24 19:59:30.271876] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:38.702 [2024-07-24 19:59:30.272060] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2011c10 00:23:38.702 [2024-07-24 19:59:30.272214] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2011060 00:23:38.702 [2024-07-24 19:59:30.272225] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2011060 00:23:38.702 [2024-07-24 19:59:30.272324] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:38.702 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:38.702 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:38.702 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.961 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.528 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.528 "name": "raid_bdev1", 00:23:39.528 "uuid": "51e0a96a-3db7-4dc6-9ed2-d112bfb9765b", 00:23:39.528 "strip_size_kb": 0, 00:23:39.528 "state": "online", 00:23:39.528 "raid_level": "raid1", 00:23:39.528 "superblock": true, 00:23:39.528 "num_base_bdevs": 4, 00:23:39.528 "num_base_bdevs_discovered": 4, 00:23:39.528 "num_base_bdevs_operational": 4, 00:23:39.528 "base_bdevs_list": [ 00:23:39.528 { 00:23:39.528 "name": "BaseBdev1", 00:23:39.528 "uuid": "68977f6e-cb0c-5712-a766-fba82e63a1d1", 00:23:39.528 "is_configured": true, 00:23:39.528 "data_offset": 2048, 00:23:39.528 "data_size": 63488 00:23:39.528 }, 00:23:39.528 { 00:23:39.528 "name": "BaseBdev2", 00:23:39.528 "uuid": "982b136a-d407-57f3-a5ec-86b1b6a183ae", 00:23:39.528 "is_configured": true, 00:23:39.528 "data_offset": 2048, 00:23:39.528 "data_size": 63488 00:23:39.528 }, 00:23:39.528 { 00:23:39.528 "name": "BaseBdev3", 00:23:39.528 "uuid": "145d8138-b627-51cd-8f45-36b8efebf4f3", 00:23:39.528 "is_configured": true, 00:23:39.528 "data_offset": 2048, 00:23:39.528 "data_size": 63488 00:23:39.528 }, 00:23:39.528 { 00:23:39.528 "name": "BaseBdev4", 00:23:39.528 "uuid": "c6d8b227-91b8-534a-ad15-cf5fae3079ab", 00:23:39.528 "is_configured": true, 00:23:39.528 "data_offset": 2048, 00:23:39.528 "data_size": 63488 00:23:39.528 } 00:23:39.528 ] 00:23:39.528 }' 00:23:39.528 19:59:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.528 19:59:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:40.095 19:59:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:23:40.095 19:59:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:40.095 [2024-07-24 19:59:31.493855] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2016c80 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:41.033 
19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.033 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:41.292 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:41.292 "name": "raid_bdev1", 00:23:41.292 "uuid": "51e0a96a-3db7-4dc6-9ed2-d112bfb9765b", 00:23:41.292 "strip_size_kb": 0, 00:23:41.292 "state": "online", 00:23:41.292 "raid_level": "raid1", 00:23:41.292 "superblock": true, 00:23:41.292 "num_base_bdevs": 4, 00:23:41.292 "num_base_bdevs_discovered": 4, 00:23:41.292 "num_base_bdevs_operational": 4, 00:23:41.292 "base_bdevs_list": [ 00:23:41.292 { 00:23:41.292 "name": "BaseBdev1", 00:23:41.292 "uuid": "68977f6e-cb0c-5712-a766-fba82e63a1d1", 00:23:41.292 "is_configured": true, 00:23:41.292 "data_offset": 2048, 00:23:41.292 "data_size": 63488 00:23:41.292 }, 00:23:41.292 { 00:23:41.292 "name": "BaseBdev2", 00:23:41.292 "uuid": "982b136a-d407-57f3-a5ec-86b1b6a183ae", 00:23:41.292 "is_configured": true, 00:23:41.292 "data_offset": 2048, 00:23:41.292 "data_size": 63488 00:23:41.292 }, 00:23:41.292 { 00:23:41.292 "name": "BaseBdev3", 00:23:41.292 "uuid": "145d8138-b627-51cd-8f45-36b8efebf4f3", 00:23:41.292 "is_configured": true, 00:23:41.292 "data_offset": 2048, 00:23:41.292 "data_size": 63488 00:23:41.292 }, 00:23:41.292 { 00:23:41.292 "name": "BaseBdev4", 00:23:41.292 "uuid": "c6d8b227-91b8-534a-ad15-cf5fae3079ab", 00:23:41.292 "is_configured": true, 00:23:41.292 "data_offset": 2048, 00:23:41.292 "data_size": 63488 00:23:41.292 } 00:23:41.292 ] 00:23:41.292 }' 00:23:41.292 19:59:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:41.292 19:59:32 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:42.227 [2024-07-24 19:59:33.704042] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:42.227 [2024-07-24 19:59:33.704082] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:42.227 [2024-07-24 19:59:33.707344] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:42.227 [2024-07-24 19:59:33.707385] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:42.227 [2024-07-24 19:59:33.707509] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:42.227 [2024-07-24 19:59:33.707522] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2011060 name raid_bdev1, state offline 00:23:42.227 0 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1482571 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 1482571 ']' 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 1482571 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1482571 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 1482571' 00:23:42.227 killing process with pid 1482571 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 1482571 00:23:42.227 [2024-07-24 19:59:33.773464] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:42.227 19:59:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 1482571 00:23:42.227 [2024-07-24 19:59:33.803375] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.JCjdEvV2D8 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:42.486 00:23:42.486 real 0m7.089s 00:23:42.486 user 0m11.246s 00:23:42.486 sys 0m1.246s 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:42.486 19:59:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:42.486 ************************************ 00:23:42.486 END TEST raid_read_error_test 00:23:42.486 ************************************ 00:23:42.486 19:59:34 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:23:42.486 19:59:34 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 
1 ']' 00:23:42.486 19:59:34 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:42.486 19:59:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:42.744 ************************************ 00:23:42.744 START TEST raid_write_error_test 00:23:42.744 ************************************ 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 write 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:23:42.745 19:59:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.L2ZFQqYMDm 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=1483553 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 1483553 /var/tmp/spdk-raid.sock 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 1483553 ']' 00:23:42.745 19:59:34 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:42.745 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:42.745 19:59:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:42.745 [2024-07-24 19:59:34.198077] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:23:42.745 [2024-07-24 19:59:34.198150] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1483553 ] 00:23:42.745 [2024-07-24 19:59:34.331349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:43.003 [2024-07-24 19:59:34.434784] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:43.003 [2024-07-24 19:59:34.500195] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:43.003 [2024-07-24 19:59:34.500235] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:43.570 19:59:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:43.570 19:59:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:23:43.570 19:59:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:43.570 19:59:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:43.829 BaseBdev1_malloc 00:23:43.829 19:59:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:44.088 true 00:23:44.088 19:59:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:44.088 [2024-07-24 19:59:35.630224] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:44.088 [2024-07-24 19:59:35.630267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:44.088 [2024-07-24 19:59:35.630286] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d433a0 00:23:44.088 [2024-07-24 19:59:35.630299] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:44.088 [2024-07-24 19:59:35.631854] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:44.088 [2024-07-24 19:59:35.631882] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:44.088 BaseBdev1 00:23:44.088 19:59:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:44.088 19:59:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:44.347 BaseBdev2_malloc 00:23:44.347 19:59:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:44.605 true 00:23:44.606 19:59:36 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:44.864 [2024-07-24 19:59:36.312641] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:44.864 [2024-07-24 19:59:36.312687] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:44.864 [2024-07-24 19:59:36.312709] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e02370 00:23:44.864 [2024-07-24 19:59:36.312722] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:44.864 [2024-07-24 19:59:36.314112] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:44.864 [2024-07-24 19:59:36.314141] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:44.864 BaseBdev2 00:23:44.864 19:59:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:44.864 19:59:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:45.123 BaseBdev3_malloc 00:23:45.123 19:59:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:45.123 true 00:23:45.123 19:59:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:45.381 [2024-07-24 19:59:36.850566] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:45.381 [2024-07-24 19:59:36.850609] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.381 [2024-07-24 19:59:36.850629] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d382d0 00:23:45.381 [2024-07-24 19:59:36.850642] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.381 [2024-07-24 19:59:36.852023] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.381 [2024-07-24 19:59:36.852049] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:45.381 BaseBdev3 00:23:45.381 19:59:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:23:45.381 19:59:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:45.640 BaseBdev4_malloc 00:23:45.640 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:45.899 true 00:23:45.899 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:46.158 [2024-07-24 19:59:37.545079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:46.158 [2024-07-24 19:59:37.545128] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.158 [2024-07-24 19:59:37.545153] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d3b310 00:23:46.158 [2024-07-24 19:59:37.545165] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.158 [2024-07-24 19:59:37.546650] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:23:46.158 [2024-07-24 19:59:37.546678] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:46.158 BaseBdev4 00:23:46.158 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:46.158 [2024-07-24 19:59:37.725586] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:46.158 [2024-07-24 19:59:37.726830] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:46.158 [2024-07-24 19:59:37.726905] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:46.158 [2024-07-24 19:59:37.726964] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:46.158 [2024-07-24 19:59:37.727200] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d3c060 00:23:46.158 [2024-07-24 19:59:37.727211] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:46.158 [2024-07-24 19:59:37.727410] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d3cc10 00:23:46.158 [2024-07-24 19:59:37.727570] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d3c060 00:23:46.158 [2024-07-24 19:59:37.727580] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d3c060 00:23:46.158 [2024-07-24 19:59:37.727685] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.158 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:46.158 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:46.158 19:59:37 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.158 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.417 "name": "raid_bdev1", 00:23:46.417 "uuid": "00dbc31c-a70e-4675-9323-04716ca9365c", 00:23:46.417 "strip_size_kb": 0, 00:23:46.417 "state": "online", 00:23:46.417 "raid_level": "raid1", 00:23:46.417 "superblock": true, 00:23:46.417 "num_base_bdevs": 4, 00:23:46.417 "num_base_bdevs_discovered": 4, 00:23:46.417 "num_base_bdevs_operational": 4, 00:23:46.417 "base_bdevs_list": [ 00:23:46.417 { 00:23:46.417 "name": "BaseBdev1", 00:23:46.417 "uuid": "3419f03e-d41a-5bda-a837-f0dd88a1653b", 00:23:46.417 "is_configured": true, 00:23:46.417 "data_offset": 2048, 00:23:46.417 "data_size": 63488 00:23:46.417 }, 00:23:46.417 { 00:23:46.417 "name": "BaseBdev2", 00:23:46.417 "uuid": "1d154ea3-4be7-5113-a9da-ff12ce626052", 00:23:46.417 "is_configured": true, 
00:23:46.417 "data_offset": 2048, 00:23:46.417 "data_size": 63488 00:23:46.417 }, 00:23:46.417 { 00:23:46.417 "name": "BaseBdev3", 00:23:46.417 "uuid": "b9ecba4f-c239-53c7-84e8-f2bc119e8976", 00:23:46.417 "is_configured": true, 00:23:46.417 "data_offset": 2048, 00:23:46.417 "data_size": 63488 00:23:46.417 }, 00:23:46.417 { 00:23:46.417 "name": "BaseBdev4", 00:23:46.417 "uuid": "961c30ed-b860-5963-a61a-d7f38967ba85", 00:23:46.417 "is_configured": true, 00:23:46.417 "data_offset": 2048, 00:23:46.417 "data_size": 63488 00:23:46.417 } 00:23:46.417 ] 00:23:46.417 }' 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.417 19:59:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:47.353 19:59:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:23:47.353 19:59:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:47.353 [2024-07-24 19:59:38.692464] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d41c80 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:48.289 [2024-07-24 19:59:39.816763] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:23:48.289 [2024-07-24 19:59:39.816826] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:48.289 [2024-07-24 19:59:39.817044] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d41c80 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=3 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.289 19:59:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.546 19:59:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.546 "name": "raid_bdev1", 00:23:48.546 "uuid": "00dbc31c-a70e-4675-9323-04716ca9365c", 00:23:48.546 "strip_size_kb": 0, 00:23:48.547 "state": "online", 00:23:48.547 "raid_level": 
"raid1", 00:23:48.547 "superblock": true, 00:23:48.547 "num_base_bdevs": 4, 00:23:48.547 "num_base_bdevs_discovered": 3, 00:23:48.547 "num_base_bdevs_operational": 3, 00:23:48.547 "base_bdevs_list": [ 00:23:48.547 { 00:23:48.547 "name": null, 00:23:48.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:48.547 "is_configured": false, 00:23:48.547 "data_offset": 2048, 00:23:48.547 "data_size": 63488 00:23:48.547 }, 00:23:48.547 { 00:23:48.547 "name": "BaseBdev2", 00:23:48.547 "uuid": "1d154ea3-4be7-5113-a9da-ff12ce626052", 00:23:48.547 "is_configured": true, 00:23:48.547 "data_offset": 2048, 00:23:48.547 "data_size": 63488 00:23:48.547 }, 00:23:48.547 { 00:23:48.547 "name": "BaseBdev3", 00:23:48.547 "uuid": "b9ecba4f-c239-53c7-84e8-f2bc119e8976", 00:23:48.547 "is_configured": true, 00:23:48.547 "data_offset": 2048, 00:23:48.547 "data_size": 63488 00:23:48.547 }, 00:23:48.547 { 00:23:48.547 "name": "BaseBdev4", 00:23:48.547 "uuid": "961c30ed-b860-5963-a61a-d7f38967ba85", 00:23:48.547 "is_configured": true, 00:23:48.547 "data_offset": 2048, 00:23:48.547 "data_size": 63488 00:23:48.547 } 00:23:48.547 ] 00:23:48.547 }' 00:23:48.547 19:59:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.547 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:49.113 19:59:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:49.371 [2024-07-24 19:59:40.921580] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:49.371 [2024-07-24 19:59:40.921625] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:49.371 [2024-07-24 19:59:40.924927] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:49.371 [2024-07-24 19:59:40.924963] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:23:49.371 [2024-07-24 19:59:40.925055] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:49.371 [2024-07-24 19:59:40.925067] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d3c060 name raid_bdev1, state offline 00:23:49.371 0 00:23:49.371 19:59:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 1483553 00:23:49.371 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 1483553 ']' 00:23:49.371 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 1483553 00:23:49.371 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:23:49.371 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:49.371 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1483553 00:23:49.630 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:49.630 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:49.630 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1483553' 00:23:49.630 killing process with pid 1483553 00:23:49.630 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 1483553 00:23:49.630 [2024-07-24 19:59:40.993493] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:49.630 19:59:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 1483553 00:23:49.630 [2024-07-24 19:59:41.025679] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.L2ZFQqYMDm 00:23:49.889 19:59:41 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:49.889 00:23:49.889 real 0m7.152s 00:23:49.889 user 0m11.307s 00:23:49.889 sys 0m1.317s 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:49.889 19:59:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:49.889 ************************************ 00:23:49.889 END TEST raid_write_error_test 00:23:49.889 ************************************ 00:23:49.889 19:59:41 bdev_raid -- bdev/bdev_raid.sh@955 -- # '[' true = true ']' 00:23:49.889 19:59:41 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:23:49.889 19:59:41 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:23:49.889 19:59:41 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:23:49.889 19:59:41 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:49.889 19:59:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:49.889 ************************************ 00:23:49.889 START TEST raid_rebuild_test 00:23:49.889 ************************************ 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false false true 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local data_offset 00:23:49.889 19:59:41 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:23:49.889 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=1484691 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 1484691 /var/tmp/spdk-raid.sock 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 1484691 ']' 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:49.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:49.890 19:59:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:49.890 [2024-07-24 19:59:41.477861] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:23:49.890 [2024-07-24 19:59:41.477997] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1484691 ] 00:23:49.890 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:49.890 Zero copy mechanism will not be used. 00:23:50.148 [2024-07-24 19:59:41.672620] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:50.407 [2024-07-24 19:59:41.773400] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:50.407 [2024-07-24 19:59:41.833102] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:50.407 [2024-07-24 19:59:41.833159] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:51.029 19:59:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:51.029 19:59:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:23:51.029 19:59:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:51.029 19:59:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:51.310 BaseBdev1_malloc 00:23:51.310 19:59:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:51.310 [2024-07-24 19:59:42.829358] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:51.310 [2024-07-24 19:59:42.829410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:51.310 [2024-07-24 19:59:42.829436] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device 
created at: 0x0x9aecd0 00:23:51.310 [2024-07-24 19:59:42.829455] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:51.310 [2024-07-24 19:59:42.831146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:51.310 [2024-07-24 19:59:42.831175] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:51.310 BaseBdev1 00:23:51.310 19:59:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:51.310 19:59:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:51.569 BaseBdev2_malloc 00:23:51.569 19:59:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:51.828 [2024-07-24 19:59:43.316701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:51.828 [2024-07-24 19:59:43.316747] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:51.828 [2024-07-24 19:59:43.316767] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9b2460 00:23:51.828 [2024-07-24 19:59:43.316780] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:51.828 [2024-07-24 19:59:43.318313] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:51.828 [2024-07-24 19:59:43.318341] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:51.828 BaseBdev2 00:23:51.828 19:59:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:52.086 spare_malloc 00:23:52.086 
19:59:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:52.344 spare_delay 00:23:52.344 19:59:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:52.602 [2024-07-24 19:59:44.043218] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:52.602 [2024-07-24 19:59:44.043265] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:52.602 [2024-07-24 19:59:44.043288] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a6c70 00:23:52.602 [2024-07-24 19:59:44.043300] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:52.602 [2024-07-24 19:59:44.044900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:52.602 [2024-07-24 19:59:44.044930] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:52.602 spare 00:23:52.602 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:52.861 [2024-07-24 19:59:44.275840] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:52.861 [2024-07-24 19:59:44.277154] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:52.861 [2024-07-24 19:59:44.277240] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa71c90 00:23:52.861 [2024-07-24 19:59:44.277251] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:52.861 [2024-07-24 19:59:44.277466] 
bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a6f00 00:23:52.861 [2024-07-24 19:59:44.277612] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa71c90 00:23:52.861 [2024-07-24 19:59:44.277622] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa71c90 00:23:52.861 [2024-07-24 19:59:44.277740] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.861 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.120 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.120 "name": "raid_bdev1", 
00:23:53.120 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:23:53.120 "strip_size_kb": 0, 00:23:53.120 "state": "online", 00:23:53.120 "raid_level": "raid1", 00:23:53.120 "superblock": false, 00:23:53.120 "num_base_bdevs": 2, 00:23:53.120 "num_base_bdevs_discovered": 2, 00:23:53.120 "num_base_bdevs_operational": 2, 00:23:53.120 "base_bdevs_list": [ 00:23:53.120 { 00:23:53.120 "name": "BaseBdev1", 00:23:53.120 "uuid": "4b2dfb6e-af58-5678-b2dd-91302deb490c", 00:23:53.120 "is_configured": true, 00:23:53.120 "data_offset": 0, 00:23:53.120 "data_size": 65536 00:23:53.120 }, 00:23:53.120 { 00:23:53.120 "name": "BaseBdev2", 00:23:53.120 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:23:53.120 "is_configured": true, 00:23:53.120 "data_offset": 0, 00:23:53.120 "data_size": 65536 00:23:53.120 } 00:23:53.120 ] 00:23:53.120 }' 00:23:53.120 19:59:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.120 19:59:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:53.687 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:53.687 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:23:53.946 [2024-07-24 19:59:45.306815] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:53.946 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:23:53.946 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.946 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:23:54.205 19:59:45 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:54.205 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:54.464 [2024-07-24 19:59:45.811965] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ae9a0 00:23:54.464 /dev/nbd0 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:23:54.464 19:59:45 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:54.464 1+0 records in 00:23:54.464 1+0 records out 00:23:54.464 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247405 s, 16.6 MB/s 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:23:54.464 19:59:45 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:01.028 65536+0 records in 00:24:01.028 65536+0 records out 00:24:01.028 33554432 bytes (34 MB, 32 MiB) copied, 6.10049 s, 5.5 MB/s 00:24:01.028 19:59:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:01.028 19:59:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:01.028 19:59:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:01.028 19:59:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:01.028 19:59:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:01.028 19:59:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:01.028 19:59:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:01.028 [2024-07-24 19:59:52.245640] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:01.028 [2024-07-24 19:59:52.482311] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.028 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.029 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.029 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.287 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.287 "name": "raid_bdev1", 00:24:01.287 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:01.287 "strip_size_kb": 0, 00:24:01.287 "state": "online", 00:24:01.287 "raid_level": "raid1", 00:24:01.287 "superblock": false, 00:24:01.287 
"num_base_bdevs": 2, 00:24:01.287 "num_base_bdevs_discovered": 1, 00:24:01.287 "num_base_bdevs_operational": 1, 00:24:01.287 "base_bdevs_list": [ 00:24:01.287 { 00:24:01.287 "name": null, 00:24:01.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.287 "is_configured": false, 00:24:01.287 "data_offset": 0, 00:24:01.287 "data_size": 65536 00:24:01.287 }, 00:24:01.287 { 00:24:01.287 "name": "BaseBdev2", 00:24:01.287 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:01.287 "is_configured": true, 00:24:01.287 "data_offset": 0, 00:24:01.287 "data_size": 65536 00:24:01.287 } 00:24:01.287 ] 00:24:01.287 }' 00:24:01.288 19:59:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.288 19:59:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:01.855 19:59:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:02.114 [2024-07-24 19:59:53.557193] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:02.114 [2024-07-24 19:59:53.562190] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a6f00 00:24:02.114 [2024-07-24 19:59:53.564519] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:02.114 19:59:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:03.051 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:03.051 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:03.051 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:03.051 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:03.051 19:59:54 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:03.051 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.051 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.309 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:03.309 "name": "raid_bdev1", 00:24:03.309 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:03.309 "strip_size_kb": 0, 00:24:03.309 "state": "online", 00:24:03.309 "raid_level": "raid1", 00:24:03.309 "superblock": false, 00:24:03.309 "num_base_bdevs": 2, 00:24:03.309 "num_base_bdevs_discovered": 2, 00:24:03.309 "num_base_bdevs_operational": 2, 00:24:03.309 "process": { 00:24:03.309 "type": "rebuild", 00:24:03.309 "target": "spare", 00:24:03.309 "progress": { 00:24:03.309 "blocks": 24576, 00:24:03.309 "percent": 37 00:24:03.309 } 00:24:03.309 }, 00:24:03.309 "base_bdevs_list": [ 00:24:03.309 { 00:24:03.309 "name": "spare", 00:24:03.309 "uuid": "30a41601-0626-5f34-be20-cb5eb9df1eca", 00:24:03.309 "is_configured": true, 00:24:03.309 "data_offset": 0, 00:24:03.309 "data_size": 65536 00:24:03.309 }, 00:24:03.309 { 00:24:03.309 "name": "BaseBdev2", 00:24:03.309 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:03.309 "is_configured": true, 00:24:03.309 "data_offset": 0, 00:24:03.309 "data_size": 65536 00:24:03.309 } 00:24:03.309 ] 00:24:03.309 }' 00:24:03.310 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:03.310 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:03.310 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:03.568 19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:03.568 
19:59:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:03.568 [2024-07-24 19:59:55.138368] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:03.827 [2024-07-24 19:59:55.177302] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:03.827 [2024-07-24 19:59:55.177354] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:03.827 [2024-07-24 19:59:55.177370] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:03.827 [2024-07-24 19:59:55.177378] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.827 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.086 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.086 "name": "raid_bdev1", 00:24:04.086 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:04.086 "strip_size_kb": 0, 00:24:04.086 "state": "online", 00:24:04.086 "raid_level": "raid1", 00:24:04.086 "superblock": false, 00:24:04.086 "num_base_bdevs": 2, 00:24:04.086 "num_base_bdevs_discovered": 1, 00:24:04.086 "num_base_bdevs_operational": 1, 00:24:04.086 "base_bdevs_list": [ 00:24:04.086 { 00:24:04.086 "name": null, 00:24:04.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.086 "is_configured": false, 00:24:04.086 "data_offset": 0, 00:24:04.086 "data_size": 65536 00:24:04.086 }, 00:24:04.086 { 00:24:04.086 "name": "BaseBdev2", 00:24:04.086 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:04.086 "is_configured": true, 00:24:04.086 "data_offset": 0, 00:24:04.086 "data_size": 65536 00:24:04.086 } 00:24:04.086 ] 00:24:04.086 }' 00:24:04.086 19:59:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:04.086 19:59:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:04.654 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:04.654 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:04.654 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:04.654 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:04.654 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:04.654 19:59:56 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.654 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.913 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:04.913 "name": "raid_bdev1", 00:24:04.913 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:04.913 "strip_size_kb": 0, 00:24:04.913 "state": "online", 00:24:04.913 "raid_level": "raid1", 00:24:04.913 "superblock": false, 00:24:04.913 "num_base_bdevs": 2, 00:24:04.913 "num_base_bdevs_discovered": 1, 00:24:04.913 "num_base_bdevs_operational": 1, 00:24:04.913 "base_bdevs_list": [ 00:24:04.913 { 00:24:04.913 "name": null, 00:24:04.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.913 "is_configured": false, 00:24:04.913 "data_offset": 0, 00:24:04.913 "data_size": 65536 00:24:04.913 }, 00:24:04.913 { 00:24:04.913 "name": "BaseBdev2", 00:24:04.913 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:04.913 "is_configured": true, 00:24:04.913 "data_offset": 0, 00:24:04.913 "data_size": 65536 00:24:04.913 } 00:24:04.913 ] 00:24:04.913 }' 00:24:04.913 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:04.913 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:04.913 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:04.913 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:04.913 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:05.171 [2024-07-24 19:59:56.630335] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 
00:24:05.171 [2024-07-24 19:59:56.635632] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ae9a0 00:24:05.171 [2024-07-24 19:59:56.637121] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:05.171 19:59:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:06.108 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:06.108 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.108 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:06.108 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:06.108 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.108 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.108 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.367 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.367 "name": "raid_bdev1", 00:24:06.367 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:06.367 "strip_size_kb": 0, 00:24:06.367 "state": "online", 00:24:06.367 "raid_level": "raid1", 00:24:06.367 "superblock": false, 00:24:06.367 "num_base_bdevs": 2, 00:24:06.367 "num_base_bdevs_discovered": 2, 00:24:06.367 "num_base_bdevs_operational": 2, 00:24:06.367 "process": { 00:24:06.367 "type": "rebuild", 00:24:06.367 "target": "spare", 00:24:06.367 "progress": { 00:24:06.367 "blocks": 24576, 00:24:06.367 "percent": 37 00:24:06.367 } 00:24:06.367 }, 00:24:06.367 "base_bdevs_list": [ 00:24:06.367 { 00:24:06.367 "name": "spare", 00:24:06.367 "uuid": 
"30a41601-0626-5f34-be20-cb5eb9df1eca", 00:24:06.367 "is_configured": true, 00:24:06.367 "data_offset": 0, 00:24:06.367 "data_size": 65536 00:24:06.367 }, 00:24:06.367 { 00:24:06.367 "name": "BaseBdev2", 00:24:06.367 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:06.367 "is_configured": true, 00:24:06.367 "data_offset": 0, 00:24:06.367 "data_size": 65536 00:24:06.367 } 00:24:06.367 ] 00:24:06.367 }' 00:24:06.367 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.367 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:06.367 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=818 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:06.626 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:06.627 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.627 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:06.627 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:06.627 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:24:06.627 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.627 19:59:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.886 19:59:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.886 "name": "raid_bdev1", 00:24:06.886 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:06.886 "strip_size_kb": 0, 00:24:06.886 "state": "online", 00:24:06.886 "raid_level": "raid1", 00:24:06.886 "superblock": false, 00:24:06.886 "num_base_bdevs": 2, 00:24:06.886 "num_base_bdevs_discovered": 2, 00:24:06.886 "num_base_bdevs_operational": 2, 00:24:06.886 "process": { 00:24:06.886 "type": "rebuild", 00:24:06.886 "target": "spare", 00:24:06.886 "progress": { 00:24:06.886 "blocks": 30720, 00:24:06.886 "percent": 46 00:24:06.886 } 00:24:06.886 }, 00:24:06.886 "base_bdevs_list": [ 00:24:06.886 { 00:24:06.886 "name": "spare", 00:24:06.886 "uuid": "30a41601-0626-5f34-be20-cb5eb9df1eca", 00:24:06.886 "is_configured": true, 00:24:06.886 "data_offset": 0, 00:24:06.886 "data_size": 65536 00:24:06.886 }, 00:24:06.886 { 00:24:06.886 "name": "BaseBdev2", 00:24:06.886 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:06.886 "is_configured": true, 00:24:06.886 "data_offset": 0, 00:24:06.886 "data_size": 65536 00:24:06.886 } 00:24:06.886 ] 00:24:06.886 }' 00:24:06.886 19:59:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.886 19:59:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:06.886 19:59:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.886 19:59:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:06.886 19:59:58 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.823 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.082 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.082 "name": "raid_bdev1", 00:24:08.082 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:08.082 "strip_size_kb": 0, 00:24:08.082 "state": "online", 00:24:08.082 "raid_level": "raid1", 00:24:08.082 "superblock": false, 00:24:08.082 "num_base_bdevs": 2, 00:24:08.082 "num_base_bdevs_discovered": 2, 00:24:08.082 "num_base_bdevs_operational": 2, 00:24:08.082 "process": { 00:24:08.082 "type": "rebuild", 00:24:08.082 "target": "spare", 00:24:08.082 "progress": { 00:24:08.082 "blocks": 59392, 00:24:08.082 "percent": 90 00:24:08.082 } 00:24:08.082 }, 00:24:08.082 "base_bdevs_list": [ 00:24:08.082 { 00:24:08.082 "name": "spare", 00:24:08.082 "uuid": "30a41601-0626-5f34-be20-cb5eb9df1eca", 00:24:08.082 "is_configured": true, 00:24:08.082 "data_offset": 0, 00:24:08.082 "data_size": 65536 00:24:08.082 }, 00:24:08.082 { 00:24:08.082 "name": "BaseBdev2", 
00:24:08.082 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:08.082 "is_configured": true, 00:24:08.082 "data_offset": 0, 00:24:08.082 "data_size": 65536 00:24:08.082 } 00:24:08.082 ] 00:24:08.082 }' 00:24:08.082 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.082 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:08.082 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.341 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:08.341 19:59:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:08.341 [2024-07-24 19:59:59.862558] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:08.341 [2024-07-24 19:59:59.862615] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:08.341 [2024-07-24 19:59:59.862651] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:09.278 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:09.278 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:09.278 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.278 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:09.278 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:09.278 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.278 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.278 20:00:00 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.537 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.537 "name": "raid_bdev1", 00:24:09.537 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:09.537 "strip_size_kb": 0, 00:24:09.537 "state": "online", 00:24:09.537 "raid_level": "raid1", 00:24:09.537 "superblock": false, 00:24:09.537 "num_base_bdevs": 2, 00:24:09.537 "num_base_bdevs_discovered": 2, 00:24:09.537 "num_base_bdevs_operational": 2, 00:24:09.537 "base_bdevs_list": [ 00:24:09.537 { 00:24:09.537 "name": "spare", 00:24:09.537 "uuid": "30a41601-0626-5f34-be20-cb5eb9df1eca", 00:24:09.537 "is_configured": true, 00:24:09.537 "data_offset": 0, 00:24:09.537 "data_size": 65536 00:24:09.537 }, 00:24:09.537 { 00:24:09.537 "name": "BaseBdev2", 00:24:09.537 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:09.537 "is_configured": true, 00:24:09.537 "data_offset": 0, 00:24:09.537 "data_size": 65536 00:24:09.537 } 00:24:09.537 ] 00:24:09.537 }' 00:24:09.537 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.537 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:09.537 20:00:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@184 -- # local target=none 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.537 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.797 "name": "raid_bdev1", 00:24:09.797 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:09.797 "strip_size_kb": 0, 00:24:09.797 "state": "online", 00:24:09.797 "raid_level": "raid1", 00:24:09.797 "superblock": false, 00:24:09.797 "num_base_bdevs": 2, 00:24:09.797 "num_base_bdevs_discovered": 2, 00:24:09.797 "num_base_bdevs_operational": 2, 00:24:09.797 "base_bdevs_list": [ 00:24:09.797 { 00:24:09.797 "name": "spare", 00:24:09.797 "uuid": "30a41601-0626-5f34-be20-cb5eb9df1eca", 00:24:09.797 "is_configured": true, 00:24:09.797 "data_offset": 0, 00:24:09.797 "data_size": 65536 00:24:09.797 }, 00:24:09.797 { 00:24:09.797 "name": "BaseBdev2", 00:24:09.797 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:09.797 "is_configured": true, 00:24:09.797 "data_offset": 0, 00:24:09.797 "data_size": 65536 00:24:09.797 } 00:24:09.797 ] 00:24:09.797 }' 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.797 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.056 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:10.056 "name": "raid_bdev1", 00:24:10.056 "uuid": "c7879b1f-80bb-4e8f-9812-5d46814f65c2", 00:24:10.056 "strip_size_kb": 0, 00:24:10.056 "state": "online", 00:24:10.056 "raid_level": "raid1", 00:24:10.056 "superblock": false, 00:24:10.056 "num_base_bdevs": 2, 00:24:10.056 "num_base_bdevs_discovered": 2, 00:24:10.056 "num_base_bdevs_operational": 2, 00:24:10.056 "base_bdevs_list": [ 00:24:10.056 { 00:24:10.056 "name": "spare", 00:24:10.056 "uuid": "30a41601-0626-5f34-be20-cb5eb9df1eca", 00:24:10.056 "is_configured": true, 00:24:10.056 "data_offset": 0, 00:24:10.056 "data_size": 65536 00:24:10.056 }, 00:24:10.056 { 00:24:10.056 
"name": "BaseBdev2", 00:24:10.056 "uuid": "802b3615-f6b1-5639-84d8-b1e53993b057", 00:24:10.056 "is_configured": true, 00:24:10.056 "data_offset": 0, 00:24:10.056 "data_size": 65536 00:24:10.056 } 00:24:10.056 ] 00:24:10.056 }' 00:24:10.056 20:00:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:10.056 20:00:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:10.985 20:00:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:10.985 [2024-07-24 20:00:02.441916] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:10.985 [2024-07-24 20:00:02.441946] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:10.985 [2024-07-24 20:00:02.442004] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:10.985 [2024-07-24 20:00:02.442061] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:10.985 [2024-07-24 20:00:02.442072] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa71c90 name raid_bdev1, state offline 00:24:10.985 20:00:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.985 20:00:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:11.243 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:11.501 /dev/nbd0 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:11.501 20:00:02 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:11.501 1+0 records in 00:24:11.501 1+0 records out 00:24:11.501 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000166853 s, 24.5 MB/s 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:11.501 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:11.502 20:00:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:24:11.502 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:11.502 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:11.502 20:00:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:11.502 /dev/nbd1 00:24:11.760 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:11.760 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:11.760 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:24:11.760 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:24:11.760 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:11.761 1+0 records in 00:24:11.761 1+0 records out 00:24:11.761 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000634309 s, 6.5 MB/s 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:11.761 
20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:11.761 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:12.018 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:12.284 20:00:03 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 1484691 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 1484691 ']' 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 1484691 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1484691 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1484691' 00:24:12.284 killing process with pid 1484691 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 1484691 00:24:12.284 Received shutdown signal, test time was about 60.000000 seconds 00:24:12.284 00:24:12.284 Latency(us) 00:24:12.284 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:12.284 
=================================================================================================================== 00:24:12.284 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:12.284 [2024-07-24 20:00:03.750991] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:12.284 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 1484691 00:24:12.284 [2024-07-24 20:00:03.779042] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:12.544 20:00:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:24:12.544 00:24:12.544 real 0m22.639s 00:24:12.544 user 0m29.654s 00:24:12.544 sys 0m5.402s 00:24:12.544 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:12.544 20:00:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:12.544 ************************************ 00:24:12.544 END TEST raid_rebuild_test 00:24:12.544 ************************************ 00:24:12.544 20:00:04 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:24:12.544 20:00:04 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:12.544 20:00:04 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:12.544 20:00:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:12.544 ************************************ 00:24:12.544 START TEST raid_rebuild_test_sb 00:24:12.544 ************************************ 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local 
superblock=true 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:24:12.544 20:00:04 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=1487869 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 1487869 /var/tmp/spdk-raid.sock 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1487869 ']' 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:12.544 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:12.544 20:00:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:12.803 [2024-07-24 20:00:04.154455] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:24:12.803 [2024-07-24 20:00:04.154526] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1487869 ] 00:24:12.803 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:12.803 Zero copy mechanism will not be used. 00:24:12.803 [2024-07-24 20:00:04.282111] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:12.803 [2024-07-24 20:00:04.384161] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:13.061 [2024-07-24 20:00:04.443592] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:13.061 [2024-07-24 20:00:04.443617] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:13.629 20:00:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:13.629 20:00:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:24:13.629 20:00:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:13.629 20:00:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:13.888 BaseBdev1_malloc 00:24:13.888 20:00:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:14.529 [2024-07-24 20:00:05.807362] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:14.529 [2024-07-24 20:00:05.807420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:14.529 [2024-07-24 20:00:05.807446] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x995cd0 00:24:14.529 [2024-07-24 20:00:05.807459] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:14.529 [2024-07-24 20:00:05.809148] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:14.529 [2024-07-24 20:00:05.809184] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:14.529 BaseBdev1 00:24:14.529 20:00:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:14.529 20:00:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:14.529 BaseBdev2_malloc 00:24:14.810 20:00:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:15.069 [2024-07-24 20:00:06.574124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:15.069 [2024-07-24 20:00:06.574177] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.069 [2024-07-24 20:00:06.574199] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x999460 00:24:15.069 [2024-07-24 20:00:06.574211] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.069 [2024-07-24 20:00:06.575802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.069 [2024-07-24 20:00:06.575830] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:15.069 BaseBdev2 00:24:15.069 20:00:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:15.329 
spare_malloc 00:24:15.329 20:00:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:15.590 spare_delay 00:24:15.590 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:15.849 [2024-07-24 20:00:07.320672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:15.849 [2024-07-24 20:00:07.320722] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.849 [2024-07-24 20:00:07.320745] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x98dc70 00:24:15.849 [2024-07-24 20:00:07.320758] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.849 [2024-07-24 20:00:07.322362] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.849 [2024-07-24 20:00:07.322399] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:15.849 spare 00:24:15.849 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:16.109 [2024-07-24 20:00:07.553322] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:16.109 [2024-07-24 20:00:07.554621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:16.109 [2024-07-24 20:00:07.554783] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa58c90 00:24:16.109 [2024-07-24 20:00:07.554797] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:16.109 
[2024-07-24 20:00:07.555003] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98df00 00:24:16.109 [2024-07-24 20:00:07.555140] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa58c90 00:24:16.109 [2024-07-24 20:00:07.555150] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa58c90 00:24:16.109 [2024-07-24 20:00:07.555248] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.109 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.368 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:24:16.368 "name": "raid_bdev1", 00:24:16.368 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:16.368 "strip_size_kb": 0, 00:24:16.368 "state": "online", 00:24:16.368 "raid_level": "raid1", 00:24:16.368 "superblock": true, 00:24:16.368 "num_base_bdevs": 2, 00:24:16.368 "num_base_bdevs_discovered": 2, 00:24:16.368 "num_base_bdevs_operational": 2, 00:24:16.368 "base_bdevs_list": [ 00:24:16.368 { 00:24:16.368 "name": "BaseBdev1", 00:24:16.368 "uuid": "d60d04c9-5e68-5ede-b083-b2873ec65692", 00:24:16.368 "is_configured": true, 00:24:16.368 "data_offset": 2048, 00:24:16.368 "data_size": 63488 00:24:16.368 }, 00:24:16.368 { 00:24:16.368 "name": "BaseBdev2", 00:24:16.368 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:16.368 "is_configured": true, 00:24:16.368 "data_offset": 2048, 00:24:16.368 "data_size": 63488 00:24:16.368 } 00:24:16.368 ] 00:24:16.368 }' 00:24:16.368 20:00:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:16.369 20:00:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:16.936 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:24:16.936 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:17.195 [2024-07-24 20:00:08.660455] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:17.195 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:24:17.195 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:17.195 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.455 20:00:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:17.455 20:00:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:17.714 [2024-07-24 20:00:09.153548] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98df00 00:24:17.714 /dev/nbd0 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local 
nbd_name=nbd0 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:17.714 1+0 records in 00:24:17.714 1+0 records out 00:24:17.714 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260354 s, 15.7 MB/s 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- 
# '[' raid1 = raid5f ']' 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:24:17.714 20:00:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:23.012 63488+0 records in 00:24:23.012 63488+0 records out 00:24:23.012 32505856 bytes (33 MB, 31 MiB) copied, 4.71695 s, 6.9 MB/s 00:24:23.012 20:00:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:23.012 20:00:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:23.012 20:00:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:23.012 20:00:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:23.012 20:00:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:23.012 20:00:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:23.012 20:00:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:23.012 [2024-07-24 20:00:14.205093] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 
00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:23.012 [2024-07-24 20:00:14.440854] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.012 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.271 20:00:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.271 "name": "raid_bdev1", 00:24:23.271 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:23.271 "strip_size_kb": 0, 00:24:23.271 "state": "online", 00:24:23.271 "raid_level": "raid1", 00:24:23.271 "superblock": true, 00:24:23.271 "num_base_bdevs": 2, 00:24:23.271 "num_base_bdevs_discovered": 1, 00:24:23.271 "num_base_bdevs_operational": 1, 00:24:23.271 "base_bdevs_list": [ 00:24:23.271 { 00:24:23.271 "name": null, 00:24:23.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.271 "is_configured": false, 00:24:23.271 "data_offset": 2048, 00:24:23.271 "data_size": 63488 00:24:23.271 }, 00:24:23.271 { 00:24:23.271 "name": "BaseBdev2", 00:24:23.271 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:23.271 "is_configured": true, 00:24:23.271 "data_offset": 2048, 00:24:23.271 "data_size": 63488 00:24:23.271 } 00:24:23.271 ] 00:24:23.271 }' 00:24:23.271 20:00:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.271 20:00:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:23.841 20:00:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:23.841 [2024-07-24 20:00:15.331212] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:23.841 [2024-07-24 20:00:15.336205] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98d4e0 00:24:23.841 [2024-07-24 20:00:15.338558] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:23.841 20:00:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:24.793 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:24.793 20:00:16 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:24.793 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:24.793 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:24.793 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:24.793 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.793 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.052 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.052 "name": "raid_bdev1", 00:24:25.052 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:25.052 "strip_size_kb": 0, 00:24:25.052 "state": "online", 00:24:25.052 "raid_level": "raid1", 00:24:25.052 "superblock": true, 00:24:25.052 "num_base_bdevs": 2, 00:24:25.052 "num_base_bdevs_discovered": 2, 00:24:25.052 "num_base_bdevs_operational": 2, 00:24:25.052 "process": { 00:24:25.052 "type": "rebuild", 00:24:25.052 "target": "spare", 00:24:25.052 "progress": { 00:24:25.052 "blocks": 24576, 00:24:25.052 "percent": 38 00:24:25.052 } 00:24:25.052 }, 00:24:25.052 "base_bdevs_list": [ 00:24:25.052 { 00:24:25.052 "name": "spare", 00:24:25.052 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:25.052 "is_configured": true, 00:24:25.052 "data_offset": 2048, 00:24:25.052 "data_size": 63488 00:24:25.052 }, 00:24:25.052 { 00:24:25.052 "name": "BaseBdev2", 00:24:25.053 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:25.053 "is_configured": true, 00:24:25.053 "data_offset": 2048, 00:24:25.053 "data_size": 63488 00:24:25.053 } 00:24:25.053 ] 00:24:25.053 }' 00:24:25.053 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:24:25.311 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:25.311 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.311 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:25.311 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:25.311 [2024-07-24 20:00:16.868632] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:25.570 [2024-07-24 20:00:16.951035] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:25.570 [2024-07-24 20:00:16.951085] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:25.570 [2024-07-24 20:00:16.951102] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:25.570 [2024-07-24 20:00:16.951110] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.570 20:00:16 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.570 20:00:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.829 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.829 "name": "raid_bdev1", 00:24:25.829 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:25.829 "strip_size_kb": 0, 00:24:25.829 "state": "online", 00:24:25.829 "raid_level": "raid1", 00:24:25.829 "superblock": true, 00:24:25.829 "num_base_bdevs": 2, 00:24:25.829 "num_base_bdevs_discovered": 1, 00:24:25.829 "num_base_bdevs_operational": 1, 00:24:25.829 "base_bdevs_list": [ 00:24:25.829 { 00:24:25.829 "name": null, 00:24:25.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.829 "is_configured": false, 00:24:25.829 "data_offset": 2048, 00:24:25.829 "data_size": 63488 00:24:25.829 }, 00:24:25.829 { 00:24:25.829 "name": "BaseBdev2", 00:24:25.829 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:25.829 "is_configured": true, 00:24:25.829 "data_offset": 2048, 00:24:25.829 "data_size": 63488 00:24:25.829 } 00:24:25.829 ] 00:24:25.829 }' 00:24:25.829 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.829 20:00:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:26.398 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:26.398 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:24:26.398 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:26.398 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:26.398 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:26.398 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.398 20:00:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.657 20:00:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:26.657 "name": "raid_bdev1", 00:24:26.657 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:26.657 "strip_size_kb": 0, 00:24:26.657 "state": "online", 00:24:26.657 "raid_level": "raid1", 00:24:26.657 "superblock": true, 00:24:26.657 "num_base_bdevs": 2, 00:24:26.657 "num_base_bdevs_discovered": 1, 00:24:26.657 "num_base_bdevs_operational": 1, 00:24:26.657 "base_bdevs_list": [ 00:24:26.657 { 00:24:26.657 "name": null, 00:24:26.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.657 "is_configured": false, 00:24:26.657 "data_offset": 2048, 00:24:26.657 "data_size": 63488 00:24:26.657 }, 00:24:26.657 { 00:24:26.657 "name": "BaseBdev2", 00:24:26.657 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:26.657 "is_configured": true, 00:24:26.657 "data_offset": 2048, 00:24:26.657 "data_size": 63488 00:24:26.657 } 00:24:26.657 ] 00:24:26.657 }' 00:24:26.657 20:00:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:26.657 20:00:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:26.657 20:00:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:26.657 20:00:18 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:26.657 20:00:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:26.916 [2024-07-24 20:00:18.395141] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:26.916 [2024-07-24 20:00:18.400782] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9959a0 00:24:26.916 [2024-07-24 20:00:18.402294] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:26.916 20:00:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:27.853 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:27.853 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.853 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:27.853 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:27.853 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.853 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.853 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.114 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.114 "name": "raid_bdev1", 00:24:28.114 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:28.114 "strip_size_kb": 0, 00:24:28.114 "state": "online", 00:24:28.114 "raid_level": "raid1", 00:24:28.114 "superblock": true, 
00:24:28.114 "num_base_bdevs": 2, 00:24:28.114 "num_base_bdevs_discovered": 2, 00:24:28.114 "num_base_bdevs_operational": 2, 00:24:28.114 "process": { 00:24:28.114 "type": "rebuild", 00:24:28.114 "target": "spare", 00:24:28.114 "progress": { 00:24:28.114 "blocks": 24576, 00:24:28.114 "percent": 38 00:24:28.114 } 00:24:28.114 }, 00:24:28.114 "base_bdevs_list": [ 00:24:28.114 { 00:24:28.114 "name": "spare", 00:24:28.114 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:28.114 "is_configured": true, 00:24:28.114 "data_offset": 2048, 00:24:28.114 "data_size": 63488 00:24:28.114 }, 00:24:28.114 { 00:24:28.114 "name": "BaseBdev2", 00:24:28.114 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:28.114 "is_configured": true, 00:24:28.114 "data_offset": 2048, 00:24:28.114 "data_size": 63488 00:24:28.114 } 00:24:28.114 ] 00:24:28.114 }' 00:24:28.114 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.373 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:28.373 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.373 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:28.373 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:24:28.373 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:24:28.373 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:24:28.374 20:00:19 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=840 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.374 20:00:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.632 20:00:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.632 "name": "raid_bdev1", 00:24:28.632 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:28.632 "strip_size_kb": 0, 00:24:28.632 "state": "online", 00:24:28.632 "raid_level": "raid1", 00:24:28.632 "superblock": true, 00:24:28.632 "num_base_bdevs": 2, 00:24:28.632 "num_base_bdevs_discovered": 2, 00:24:28.632 "num_base_bdevs_operational": 2, 00:24:28.632 "process": { 00:24:28.632 "type": "rebuild", 00:24:28.632 "target": "spare", 00:24:28.632 "progress": { 00:24:28.632 "blocks": 30720, 00:24:28.632 "percent": 48 00:24:28.632 } 00:24:28.632 }, 00:24:28.633 "base_bdevs_list": [ 00:24:28.633 { 00:24:28.633 "name": "spare", 00:24:28.633 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:28.633 "is_configured": true, 00:24:28.633 "data_offset": 2048, 00:24:28.633 "data_size": 63488 00:24:28.633 }, 
00:24:28.633 { 00:24:28.633 "name": "BaseBdev2", 00:24:28.633 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:28.633 "is_configured": true, 00:24:28.633 "data_offset": 2048, 00:24:28.633 "data_size": 63488 00:24:28.633 } 00:24:28.633 ] 00:24:28.633 }' 00:24:28.633 20:00:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.633 20:00:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:28.633 20:00:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.633 20:00:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:28.633 20:00:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.570 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.829 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.829 "name": "raid_bdev1", 00:24:29.829 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 
00:24:29.829 "strip_size_kb": 0, 00:24:29.829 "state": "online", 00:24:29.829 "raid_level": "raid1", 00:24:29.829 "superblock": true, 00:24:29.829 "num_base_bdevs": 2, 00:24:29.829 "num_base_bdevs_discovered": 2, 00:24:29.829 "num_base_bdevs_operational": 2, 00:24:29.829 "process": { 00:24:29.829 "type": "rebuild", 00:24:29.829 "target": "spare", 00:24:29.829 "progress": { 00:24:29.829 "blocks": 59392, 00:24:29.829 "percent": 93 00:24:29.829 } 00:24:29.829 }, 00:24:29.829 "base_bdevs_list": [ 00:24:29.829 { 00:24:29.829 "name": "spare", 00:24:29.829 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:29.829 "is_configured": true, 00:24:29.829 "data_offset": 2048, 00:24:29.829 "data_size": 63488 00:24:29.829 }, 00:24:29.829 { 00:24:29.829 "name": "BaseBdev2", 00:24:29.829 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:29.829 "is_configured": true, 00:24:29.829 "data_offset": 2048, 00:24:29.829 "data_size": 63488 00:24:29.829 } 00:24:29.829 ] 00:24:29.829 }' 00:24:29.829 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.829 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:29.829 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.088 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:30.088 20:00:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:30.088 [2024-07-24 20:00:21.527093] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:30.088 [2024-07-24 20:00:21.527151] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:30.088 [2024-07-24 20:00:21.527233] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( 
SECONDS < timeout )) 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.026 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:31.284 "name": "raid_bdev1", 00:24:31.284 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:31.284 "strip_size_kb": 0, 00:24:31.284 "state": "online", 00:24:31.284 "raid_level": "raid1", 00:24:31.284 "superblock": true, 00:24:31.284 "num_base_bdevs": 2, 00:24:31.284 "num_base_bdevs_discovered": 2, 00:24:31.284 "num_base_bdevs_operational": 2, 00:24:31.284 "base_bdevs_list": [ 00:24:31.284 { 00:24:31.284 "name": "spare", 00:24:31.284 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:31.284 "is_configured": true, 00:24:31.284 "data_offset": 2048, 00:24:31.284 "data_size": 63488 00:24:31.284 }, 00:24:31.284 { 00:24:31.284 "name": "BaseBdev2", 00:24:31.284 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:31.284 "is_configured": true, 00:24:31.284 "data_offset": 2048, 00:24:31.284 "data_size": 63488 00:24:31.284 } 00:24:31.284 ] 00:24:31.284 }' 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.284 20:00:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.542 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:31.542 "name": "raid_bdev1", 00:24:31.542 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:31.542 "strip_size_kb": 0, 00:24:31.542 "state": "online", 00:24:31.542 "raid_level": "raid1", 00:24:31.542 "superblock": true, 00:24:31.542 "num_base_bdevs": 2, 00:24:31.542 "num_base_bdevs_discovered": 2, 00:24:31.542 "num_base_bdevs_operational": 2, 00:24:31.542 "base_bdevs_list": [ 00:24:31.542 { 00:24:31.542 "name": "spare", 00:24:31.542 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:31.542 "is_configured": true, 00:24:31.542 "data_offset": 2048, 00:24:31.542 "data_size": 
63488 00:24:31.542 }, 00:24:31.542 { 00:24:31.542 "name": "BaseBdev2", 00:24:31.542 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:31.542 "is_configured": true, 00:24:31.542 "data_offset": 2048, 00:24:31.542 "data_size": 63488 00:24:31.542 } 00:24:31.542 ] 00:24:31.542 }' 00:24:31.542 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:31.542 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:31.542 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.801 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.801 20:00:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.060 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:32.060 "name": "raid_bdev1", 00:24:32.060 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:32.060 "strip_size_kb": 0, 00:24:32.060 "state": "online", 00:24:32.060 "raid_level": "raid1", 00:24:32.060 "superblock": true, 00:24:32.060 "num_base_bdevs": 2, 00:24:32.060 "num_base_bdevs_discovered": 2, 00:24:32.060 "num_base_bdevs_operational": 2, 00:24:32.060 "base_bdevs_list": [ 00:24:32.060 { 00:24:32.060 "name": "spare", 00:24:32.060 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:32.060 "is_configured": true, 00:24:32.060 "data_offset": 2048, 00:24:32.060 "data_size": 63488 00:24:32.060 }, 00:24:32.060 { 00:24:32.060 "name": "BaseBdev2", 00:24:32.060 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:32.060 "is_configured": true, 00:24:32.060 "data_offset": 2048, 00:24:32.060 "data_size": 63488 00:24:32.060 } 00:24:32.060 ] 00:24:32.060 }' 00:24:32.060 20:00:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:32.060 20:00:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:32.627 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:32.886 [2024-07-24 20:00:24.235618] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:32.886 [2024-07-24 20:00:24.235654] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:32.886 [2024-07-24 20:00:24.235711] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:32.886 [2024-07-24 20:00:24.235764] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:32.886 [2024-07-24 20:00:24.235776] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa58c90 name raid_bdev1, state offline 00:24:32.886 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.886 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:33.144 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 
/dev/nbd0 00:24:33.403 /dev/nbd0 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:33.403 1+0 records in 00:24:33.403 1+0 records out 00:24:33.403 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244996 s, 16.7 MB/s 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:33.403 20:00:24 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:33.403 20:00:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:33.662 /dev/nbd1 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:33.662 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:33.663 1+0 records in 00:24:33.663 1+0 records out 00:24:33.663 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333154 s, 12.3 MB/s 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:33.663 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:33.921 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:24:34.179 20:00:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:34.443 20:00:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:34.710 [2024-07-24 20:00:26.178294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:34.710 [2024-07-24 20:00:26.178338] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:34.710 [2024-07-24 20:00:26.178358] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x991510 00:24:34.710 [2024-07-24 20:00:26.178370] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:34.710 [2024-07-24 20:00:26.179996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:34.710 [2024-07-24 20:00:26.180026] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:34.710 [2024-07-24 20:00:26.180101] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:34.710 [2024-07-24 20:00:26.180129] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:34.710 [2024-07-24 20:00:26.180229] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:34.710 spare 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:34.710 20:00:26 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.710 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.710 [2024-07-24 20:00:26.280546] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x98f070 00:24:34.710 [2024-07-24 20:00:26.280568] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:34.710 [2024-07-24 20:00:26.280760] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x996370 00:24:34.710 [2024-07-24 20:00:26.280911] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x98f070 00:24:34.710 [2024-07-24 20:00:26.280921] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x98f070 00:24:34.710 [2024-07-24 20:00:26.281030] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:34.969 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.969 "name": "raid_bdev1", 00:24:34.969 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:34.969 "strip_size_kb": 0, 00:24:34.969 "state": "online", 00:24:34.969 "raid_level": "raid1", 00:24:34.969 "superblock": true, 00:24:34.969 "num_base_bdevs": 2, 00:24:34.969 "num_base_bdevs_discovered": 2, 00:24:34.969 "num_base_bdevs_operational": 2, 00:24:34.969 "base_bdevs_list": [ 00:24:34.969 { 00:24:34.969 "name": 
"spare", 00:24:34.969 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:34.969 "is_configured": true, 00:24:34.969 "data_offset": 2048, 00:24:34.969 "data_size": 63488 00:24:34.969 }, 00:24:34.969 { 00:24:34.969 "name": "BaseBdev2", 00:24:34.969 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:34.969 "is_configured": true, 00:24:34.969 "data_offset": 2048, 00:24:34.969 "data_size": 63488 00:24:34.969 } 00:24:34.969 ] 00:24:34.969 }' 00:24:34.969 20:00:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.969 20:00:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:35.536 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:35.536 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:35.536 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:35.536 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:35.536 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:35.537 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.537 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.796 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:35.796 "name": "raid_bdev1", 00:24:35.796 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:35.796 "strip_size_kb": 0, 00:24:35.796 "state": "online", 00:24:35.796 "raid_level": "raid1", 00:24:35.796 "superblock": true, 00:24:35.796 "num_base_bdevs": 2, 00:24:35.796 "num_base_bdevs_discovered": 2, 00:24:35.796 "num_base_bdevs_operational": 2, 00:24:35.796 
"base_bdevs_list": [ 00:24:35.796 { 00:24:35.796 "name": "spare", 00:24:35.796 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:35.796 "is_configured": true, 00:24:35.796 "data_offset": 2048, 00:24:35.796 "data_size": 63488 00:24:35.796 }, 00:24:35.796 { 00:24:35.796 "name": "BaseBdev2", 00:24:35.796 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:35.796 "is_configured": true, 00:24:35.796 "data_offset": 2048, 00:24:35.796 "data_size": 63488 00:24:35.796 } 00:24:35.796 ] 00:24:35.796 }' 00:24:35.796 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:35.796 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:35.796 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:36.059 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:36.059 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.059 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:36.059 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:24:36.059 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:36.319 [2024-07-24 20:00:27.866919] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.319 20:00:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.579 20:00:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.579 "name": "raid_bdev1", 00:24:36.579 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:36.579 "strip_size_kb": 0, 00:24:36.579 "state": "online", 00:24:36.579 "raid_level": "raid1", 00:24:36.579 "superblock": true, 00:24:36.579 "num_base_bdevs": 2, 00:24:36.579 "num_base_bdevs_discovered": 1, 00:24:36.579 "num_base_bdevs_operational": 1, 00:24:36.579 "base_bdevs_list": [ 00:24:36.579 { 00:24:36.579 "name": null, 00:24:36.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:36.579 "is_configured": false, 00:24:36.579 "data_offset": 2048, 00:24:36.579 "data_size": 63488 00:24:36.580 }, 00:24:36.580 { 00:24:36.580 "name": "BaseBdev2", 00:24:36.580 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:36.580 "is_configured": true, 00:24:36.580 
"data_offset": 2048, 00:24:36.580 "data_size": 63488 00:24:36.580 } 00:24:36.580 ] 00:24:36.580 }' 00:24:36.580 20:00:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.580 20:00:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:37.239 20:00:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:37.498 [2024-07-24 20:00:28.945805] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:37.498 [2024-07-24 20:00:28.945959] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:37.498 [2024-07-24 20:00:28.945976] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:37.498 [2024-07-24 20:00:28.946003] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:37.498 [2024-07-24 20:00:28.950814] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x996370 00:24:37.498 [2024-07-24 20:00:28.952153] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:37.498 20:00:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:24:38.435 20:00:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:38.435 20:00:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.435 20:00:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:38.435 20:00:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:38.435 20:00:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.435 
20:00:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.435 20:00:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.695 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.695 "name": "raid_bdev1", 00:24:38.695 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:38.695 "strip_size_kb": 0, 00:24:38.695 "state": "online", 00:24:38.695 "raid_level": "raid1", 00:24:38.695 "superblock": true, 00:24:38.695 "num_base_bdevs": 2, 00:24:38.695 "num_base_bdevs_discovered": 2, 00:24:38.695 "num_base_bdevs_operational": 2, 00:24:38.695 "process": { 00:24:38.695 "type": "rebuild", 00:24:38.695 "target": "spare", 00:24:38.695 "progress": { 00:24:38.695 "blocks": 24576, 00:24:38.695 "percent": 38 00:24:38.695 } 00:24:38.695 }, 00:24:38.695 "base_bdevs_list": [ 00:24:38.695 { 00:24:38.695 "name": "spare", 00:24:38.695 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:38.695 "is_configured": true, 00:24:38.695 "data_offset": 2048, 00:24:38.695 "data_size": 63488 00:24:38.695 }, 00:24:38.695 { 00:24:38.695 "name": "BaseBdev2", 00:24:38.695 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:38.695 "is_configured": true, 00:24:38.695 "data_offset": 2048, 00:24:38.695 "data_size": 63488 00:24:38.695 } 00:24:38.695 ] 00:24:38.695 }' 00:24:38.695 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.695 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:38.695 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:38.953 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:38.953 20:00:30 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:38.953 [2024-07-24 20:00:30.523648] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:39.212 [2024-07-24 20:00:30.564720] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:39.212 [2024-07-24 20:00:30.564767] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.212 [2024-07-24 20:00:30.564782] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:39.212 [2024-07-24 20:00:30.564791] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:39.212 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:39.212 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.212 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.212 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.212 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.212 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:39.213 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.213 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.213 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.213 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.213 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.213 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.471 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.471 "name": "raid_bdev1", 00:24:39.471 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:39.471 "strip_size_kb": 0, 00:24:39.471 "state": "online", 00:24:39.471 "raid_level": "raid1", 00:24:39.471 "superblock": true, 00:24:39.471 "num_base_bdevs": 2, 00:24:39.471 "num_base_bdevs_discovered": 1, 00:24:39.471 "num_base_bdevs_operational": 1, 00:24:39.471 "base_bdevs_list": [ 00:24:39.471 { 00:24:39.471 "name": null, 00:24:39.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.471 "is_configured": false, 00:24:39.471 "data_offset": 2048, 00:24:39.471 "data_size": 63488 00:24:39.471 }, 00:24:39.471 { 00:24:39.471 "name": "BaseBdev2", 00:24:39.471 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:39.471 "is_configured": true, 00:24:39.471 "data_offset": 2048, 00:24:39.471 "data_size": 63488 00:24:39.471 } 00:24:39.471 ] 00:24:39.471 }' 00:24:39.471 20:00:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.471 20:00:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:40.041 20:00:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:40.609 [2024-07-24 20:00:31.933529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:40.609 [2024-07-24 20:00:31.933584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:40.609 [2024-07-24 20:00:31.933606] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x990340 
00:24:40.609 [2024-07-24 20:00:31.933619] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:40.609 [2024-07-24 20:00:31.934005] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:40.609 [2024-07-24 20:00:31.934026] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:40.609 [2024-07-24 20:00:31.934109] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:40.609 [2024-07-24 20:00:31.934122] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:40.609 [2024-07-24 20:00:31.934133] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:40.609 [2024-07-24 20:00:31.934152] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:40.609 [2024-07-24 20:00:31.939708] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x996370 00:24:40.609 spare 00:24:40.609 [2024-07-24 20:00:31.941099] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:40.609 20:00:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:24:41.544 20:00:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:41.544 20:00:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.544 20:00:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:41.544 20:00:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:41.544 20:00:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.544 20:00:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.544 20:00:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.807 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.807 "name": "raid_bdev1", 00:24:41.807 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:41.807 "strip_size_kb": 0, 00:24:41.807 "state": "online", 00:24:41.807 "raid_level": "raid1", 00:24:41.807 "superblock": true, 00:24:41.807 "num_base_bdevs": 2, 00:24:41.807 "num_base_bdevs_discovered": 2, 00:24:41.807 "num_base_bdevs_operational": 2, 00:24:41.807 "process": { 00:24:41.807 "type": "rebuild", 00:24:41.807 "target": "spare", 00:24:41.807 "progress": { 00:24:41.807 "blocks": 24576, 00:24:41.807 "percent": 38 00:24:41.807 } 00:24:41.807 }, 00:24:41.807 "base_bdevs_list": [ 00:24:41.807 { 00:24:41.807 "name": "spare", 00:24:41.807 "uuid": "3d432d7d-e854-5be5-94a0-7f3bad44fc33", 00:24:41.807 "is_configured": true, 00:24:41.807 "data_offset": 2048, 00:24:41.807 "data_size": 63488 00:24:41.807 }, 00:24:41.807 { 00:24:41.807 "name": "BaseBdev2", 00:24:41.807 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:41.807 "is_configured": true, 00:24:41.807 "data_offset": 2048, 00:24:41.807 "data_size": 63488 00:24:41.807 } 00:24:41.807 ] 00:24:41.807 }' 00:24:41.807 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.807 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:41.807 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.807 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:41.807 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:42.068 [2024-07-24 20:00:33.536253] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:42.068 [2024-07-24 20:00:33.553643] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:42.068 [2024-07-24 20:00:33.553686] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.068 [2024-07-24 20:00:33.553701] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:42.068 [2024-07-24 20:00:33.553710] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.068 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.327 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.327 "name": "raid_bdev1", 00:24:42.327 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:42.327 "strip_size_kb": 0, 00:24:42.327 "state": "online", 00:24:42.327 "raid_level": "raid1", 00:24:42.327 "superblock": true, 00:24:42.327 "num_base_bdevs": 2, 00:24:42.327 "num_base_bdevs_discovered": 1, 00:24:42.327 "num_base_bdevs_operational": 1, 00:24:42.327 "base_bdevs_list": [ 00:24:42.327 { 00:24:42.327 "name": null, 00:24:42.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.327 "is_configured": false, 00:24:42.327 "data_offset": 2048, 00:24:42.327 "data_size": 63488 00:24:42.327 }, 00:24:42.327 { 00:24:42.327 "name": "BaseBdev2", 00:24:42.327 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:42.327 "is_configured": true, 00:24:42.327 "data_offset": 2048, 00:24:42.327 "data_size": 63488 00:24:42.327 } 00:24:42.327 ] 00:24:42.327 }' 00:24:42.327 20:00:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.327 20:00:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:42.893 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:42.893 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:42.893 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:42.893 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:42.893 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:42.893 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.893 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.153 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.153 "name": "raid_bdev1", 00:24:43.153 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:43.153 "strip_size_kb": 0, 00:24:43.153 "state": "online", 00:24:43.153 "raid_level": "raid1", 00:24:43.153 "superblock": true, 00:24:43.153 "num_base_bdevs": 2, 00:24:43.153 "num_base_bdevs_discovered": 1, 00:24:43.153 "num_base_bdevs_operational": 1, 00:24:43.153 "base_bdevs_list": [ 00:24:43.153 { 00:24:43.153 "name": null, 00:24:43.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.153 "is_configured": false, 00:24:43.153 "data_offset": 2048, 00:24:43.153 "data_size": 63488 00:24:43.153 }, 00:24:43.153 { 00:24:43.153 "name": "BaseBdev2", 00:24:43.153 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:43.153 "is_configured": true, 00:24:43.153 "data_offset": 2048, 00:24:43.153 "data_size": 63488 00:24:43.153 } 00:24:43.153 ] 00:24:43.153 }' 00:24:43.153 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.153 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:43.153 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.413 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:43.413 20:00:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:43.671 20:00:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:43.929 [2024-07-24 20:00:35.274635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:43.930 [2024-07-24 20:00:35.274678] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:43.930 [2024-07-24 20:00:35.274699] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x995f00 00:24:43.930 [2024-07-24 20:00:35.274711] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:43.930 [2024-07-24 20:00:35.275043] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:43.930 [2024-07-24 20:00:35.275062] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:43.930 [2024-07-24 20:00:35.275124] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:43.930 [2024-07-24 20:00:35.275137] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:43.930 [2024-07-24 20:00:35.275147] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:43.930 BaseBdev1 00:24:43.930 20:00:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.867 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.127 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:45.127 "name": "raid_bdev1", 00:24:45.127 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:45.127 "strip_size_kb": 0, 00:24:45.127 "state": "online", 00:24:45.127 "raid_level": "raid1", 00:24:45.127 "superblock": true, 00:24:45.127 "num_base_bdevs": 2, 00:24:45.127 "num_base_bdevs_discovered": 1, 00:24:45.127 "num_base_bdevs_operational": 1, 00:24:45.127 "base_bdevs_list": [ 00:24:45.127 { 00:24:45.127 "name": null, 00:24:45.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.127 "is_configured": false, 00:24:45.127 "data_offset": 2048, 00:24:45.127 "data_size": 63488 00:24:45.127 }, 00:24:45.127 { 00:24:45.127 "name": "BaseBdev2", 00:24:45.127 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:45.127 "is_configured": true, 00:24:45.127 "data_offset": 2048, 00:24:45.127 "data_size": 63488 00:24:45.127 } 00:24:45.127 ] 00:24:45.127 }' 00:24:45.127 20:00:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:45.127 20:00:36 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@10 -- # set +x 00:24:45.693 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:45.693 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.693 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:45.693 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:45.693 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.693 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.693 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.951 "name": "raid_bdev1", 00:24:45.951 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:45.951 "strip_size_kb": 0, 00:24:45.951 "state": "online", 00:24:45.951 "raid_level": "raid1", 00:24:45.951 "superblock": true, 00:24:45.951 "num_base_bdevs": 2, 00:24:45.951 "num_base_bdevs_discovered": 1, 00:24:45.951 "num_base_bdevs_operational": 1, 00:24:45.951 "base_bdevs_list": [ 00:24:45.951 { 00:24:45.951 "name": null, 00:24:45.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.951 "is_configured": false, 00:24:45.951 "data_offset": 2048, 00:24:45.951 "data_size": 63488 00:24:45.951 }, 00:24:45.951 { 00:24:45.951 "name": "BaseBdev2", 00:24:45.951 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:45.951 "is_configured": true, 00:24:45.951 "data_offset": 2048, 00:24:45.951 "data_size": 63488 00:24:45.951 } 00:24:45.951 ] 00:24:45.951 }' 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type 
// "none"' 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # 
[[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:45.951 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:46.210 [2024-07-24 20:00:37.709154] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:46.210 [2024-07-24 20:00:37.709274] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:46.210 [2024-07-24 20:00:37.709289] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:46.210 request: 00:24:46.210 { 00:24:46.210 "base_bdev": "BaseBdev1", 00:24:46.210 "raid_bdev": "raid_bdev1", 00:24:46.210 "method": "bdev_raid_add_base_bdev", 00:24:46.210 "req_id": 1 00:24:46.210 } 00:24:46.210 Got JSON-RPC error response 00:24:46.210 response: 00:24:46.210 { 00:24:46.210 "code": -22, 00:24:46.210 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:46.210 } 00:24:46.210 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:24:46.210 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:24:46.210 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:24:46.210 20:00:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:24:46.210 20:00:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.163 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.422 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:47.422 "name": "raid_bdev1", 00:24:47.422 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:47.422 "strip_size_kb": 0, 00:24:47.422 "state": "online", 00:24:47.422 "raid_level": "raid1", 00:24:47.422 "superblock": true, 00:24:47.422 "num_base_bdevs": 2, 00:24:47.422 "num_base_bdevs_discovered": 1, 00:24:47.422 "num_base_bdevs_operational": 1, 00:24:47.422 "base_bdevs_list": [ 00:24:47.422 { 00:24:47.422 "name": null, 00:24:47.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.422 "is_configured": false, 00:24:47.422 "data_offset": 2048, 00:24:47.422 "data_size": 63488 00:24:47.422 }, 00:24:47.422 { 00:24:47.423 "name": "BaseBdev2", 00:24:47.423 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:47.423 "is_configured": true, 00:24:47.423 "data_offset": 2048, 00:24:47.423 
"data_size": 63488 00:24:47.423 } 00:24:47.423 ] 00:24:47.423 }' 00:24:47.423 20:00:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:47.423 20:00:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:47.991 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:47.991 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.991 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:47.991 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:47.991 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.991 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.991 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.250 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.250 "name": "raid_bdev1", 00:24:48.250 "uuid": "994a6edc-8625-4577-8801-1827f14622bc", 00:24:48.250 "strip_size_kb": 0, 00:24:48.250 "state": "online", 00:24:48.250 "raid_level": "raid1", 00:24:48.250 "superblock": true, 00:24:48.250 "num_base_bdevs": 2, 00:24:48.250 "num_base_bdevs_discovered": 1, 00:24:48.250 "num_base_bdevs_operational": 1, 00:24:48.250 "base_bdevs_list": [ 00:24:48.250 { 00:24:48.250 "name": null, 00:24:48.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.250 "is_configured": false, 00:24:48.250 "data_offset": 2048, 00:24:48.250 "data_size": 63488 00:24:48.250 }, 00:24:48.250 { 00:24:48.250 "name": "BaseBdev2", 00:24:48.250 "uuid": "387fc68f-050a-570c-8ad9-025751288013", 00:24:48.250 "is_configured": true, 
00:24:48.250 "data_offset": 2048, 00:24:48.250 "data_size": 63488 00:24:48.250 } 00:24:48.250 ] 00:24:48.250 }' 00:24:48.250 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 1487869 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1487869 ']' 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 1487869 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1487869 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1487869' 00:24:48.510 killing process with pid 1487869 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 1487869 00:24:48.510 Received shutdown signal, test time was about 60.000000 seconds 00:24:48.510 00:24:48.510 Latency(us) 00:24:48.510 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:48.510 
=================================================================================================================== 00:24:48.510 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:48.510 [2024-07-24 20:00:39.956471] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:48.510 [2024-07-24 20:00:39.956556] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:48.510 [2024-07-24 20:00:39.956599] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:48.510 [2024-07-24 20:00:39.956610] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x98f070 name raid_bdev1, state offline 00:24:48.510 20:00:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 1487869 00:24:48.510 [2024-07-24 20:00:39.984696] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:24:48.770 00:24:48.770 real 0m36.117s 00:24:48.770 user 0m53.248s 00:24:48.770 sys 0m6.427s 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:48.770 ************************************ 00:24:48.770 END TEST raid_rebuild_test_sb 00:24:48.770 ************************************ 00:24:48.770 20:00:40 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:24:48.770 20:00:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:48.770 20:00:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:48.770 20:00:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:48.770 ************************************ 00:24:48.770 START TEST raid_rebuild_test_io 00:24:48.770 ************************************ 00:24:48.770 
20:00:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false true true 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:24:48.770 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:24:48.771 
20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1493337 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1493337 /var/tmp/spdk-raid.sock 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 1493337 ']' 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:48.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:48.771 20:00:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:48.771 [2024-07-24 20:00:40.358972] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:24:48.771 [2024-07-24 20:00:40.359043] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1493337 ] 00:24:48.771 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:48.771 Zero copy mechanism will not be used. 00:24:49.031 [2024-07-24 20:00:40.487229] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:49.031 [2024-07-24 20:00:40.593063] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:49.291 [2024-07-24 20:00:40.660669] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:49.291 [2024-07-24 20:00:40.660709] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:49.859 20:00:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:49.859 20:00:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:24:49.859 20:00:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:49.859 20:00:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:50.120 BaseBdev1_malloc 00:24:50.120 20:00:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:50.380 [2024-07-24 20:00:41.762982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:50.380 [2024-07-24 20:00:41.763031] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:50.380 [2024-07-24 20:00:41.763053] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0xf8acd0 00:24:50.380 [2024-07-24 20:00:41.763066] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:50.380 [2024-07-24 20:00:41.764706] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:50.380 [2024-07-24 20:00:41.764738] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:50.380 BaseBdev1 00:24:50.380 20:00:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:50.380 20:00:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:50.639 BaseBdev2_malloc 00:24:50.639 20:00:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:50.898 [2024-07-24 20:00:42.261233] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:50.898 [2024-07-24 20:00:42.261279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:50.898 [2024-07-24 20:00:42.261299] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8e460 00:24:50.898 [2024-07-24 20:00:42.261312] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:50.898 [2024-07-24 20:00:42.262886] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:50.898 [2024-07-24 20:00:42.262918] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:50.898 BaseBdev2 00:24:50.898 20:00:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:51.158 
spare_malloc 00:24:51.158 20:00:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:51.158 spare_delay 00:24:51.419 20:00:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:51.419 [2024-07-24 20:00:42.985025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:51.419 [2024-07-24 20:00:42.985073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.419 [2024-07-24 20:00:42.985095] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf82c70 00:24:51.419 [2024-07-24 20:00:42.985108] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.419 [2024-07-24 20:00:42.986731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.419 [2024-07-24 20:00:42.986763] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:51.419 spare 00:24:51.419 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:51.679 [2024-07-24 20:00:43.229702] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:51.679 [2024-07-24 20:00:43.231022] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:51.679 [2024-07-24 20:00:43.231102] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x104dc90 00:24:51.679 [2024-07-24 20:00:43.231113] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:51.679 
[2024-07-24 20:00:43.231319] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf82f00 00:24:51.679 [2024-07-24 20:00:43.231476] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x104dc90 00:24:51.679 [2024-07-24 20:00:43.231487] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x104dc90 00:24:51.679 [2024-07-24 20:00:43.231602] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.679 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.944 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:24:51.944 "name": "raid_bdev1", 00:24:51.944 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:24:51.944 "strip_size_kb": 0, 00:24:51.944 "state": "online", 00:24:51.944 "raid_level": "raid1", 00:24:51.944 "superblock": false, 00:24:51.944 "num_base_bdevs": 2, 00:24:51.944 "num_base_bdevs_discovered": 2, 00:24:51.944 "num_base_bdevs_operational": 2, 00:24:51.944 "base_bdevs_list": [ 00:24:51.944 { 00:24:51.944 "name": "BaseBdev1", 00:24:51.944 "uuid": "4813523b-f43b-56a5-9be2-5bcad7fccb07", 00:24:51.944 "is_configured": true, 00:24:51.944 "data_offset": 0, 00:24:51.944 "data_size": 65536 00:24:51.944 }, 00:24:51.944 { 00:24:51.944 "name": "BaseBdev2", 00:24:51.944 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:24:51.944 "is_configured": true, 00:24:51.944 "data_offset": 0, 00:24:51.944 "data_size": 65536 00:24:51.944 } 00:24:51.944 ] 00:24:51.944 }' 00:24:51.944 20:00:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.944 20:00:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:52.520 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:52.520 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:24:52.779 [2024-07-24 20:00:44.328815] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:52.779 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:24:52.779 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.779 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:53.037 20:00:44 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@634 -- # data_offset=0 00:24:53.037 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:24:53.037 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:53.037 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:53.307 [2024-07-24 20:00:44.703980] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf82950 00:24:53.307 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:53.307 Zero copy mechanism will not be used. 00:24:53.307 Running I/O for 60 seconds... 00:24:53.307 [2024-07-24 20:00:44.835029] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:53.307 [2024-07-24 20:00:44.843174] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf82950 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:53.307 
20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.307 20:00:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.571 20:00:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:53.571 "name": "raid_bdev1", 00:24:53.571 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:24:53.571 "strip_size_kb": 0, 00:24:53.571 "state": "online", 00:24:53.571 "raid_level": "raid1", 00:24:53.571 "superblock": false, 00:24:53.571 "num_base_bdevs": 2, 00:24:53.571 "num_base_bdevs_discovered": 1, 00:24:53.571 "num_base_bdevs_operational": 1, 00:24:53.571 "base_bdevs_list": [ 00:24:53.571 { 00:24:53.571 "name": null, 00:24:53.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.571 "is_configured": false, 00:24:53.571 "data_offset": 0, 00:24:53.571 "data_size": 65536 00:24:53.571 }, 00:24:53.571 { 00:24:53.571 "name": "BaseBdev2", 00:24:53.571 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:24:53.571 "is_configured": true, 00:24:53.571 "data_offset": 0, 00:24:53.571 "data_size": 65536 00:24:53.571 } 00:24:53.571 ] 00:24:53.571 }' 00:24:53.571 20:00:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:53.571 20:00:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:54.506 20:00:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:54.506 [2024-07-24 20:00:45.998867] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
spare is claimed 00:24:54.506 20:00:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:54.506 [2024-07-24 20:00:46.065699] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x113c8d0 00:24:54.506 [2024-07-24 20:00:46.068046] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:54.764 [2024-07-24 20:00:46.186852] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:54.764 [2024-07-24 20:00:46.187144] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:55.022 [2024-07-24 20:00:46.406658] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:55.022 [2024-07-24 20:00:46.406865] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:55.281 [2024-07-24 20:00:46.667933] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:55.281 [2024-07-24 20:00:46.668160] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:55.541 [2024-07-24 20:00:46.887559] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:55.541 [2024-07-24 20:00:46.887801] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:55.541 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:55.541 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.541 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:24:55.541 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:55.541 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.541 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.541 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.799 [2024-07-24 20:00:47.220653] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:55.800 [2024-07-24 20:00:47.221112] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:55.800 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.800 "name": "raid_bdev1", 00:24:55.800 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:24:55.800 "strip_size_kb": 0, 00:24:55.800 "state": "online", 00:24:55.800 "raid_level": "raid1", 00:24:55.800 "superblock": false, 00:24:55.800 "num_base_bdevs": 2, 00:24:55.800 "num_base_bdevs_discovered": 2, 00:24:55.800 "num_base_bdevs_operational": 2, 00:24:55.800 "process": { 00:24:55.800 "type": "rebuild", 00:24:55.800 "target": "spare", 00:24:55.800 "progress": { 00:24:55.800 "blocks": 14336, 00:24:55.800 "percent": 21 00:24:55.800 } 00:24:55.800 }, 00:24:55.800 "base_bdevs_list": [ 00:24:55.800 { 00:24:55.800 "name": "spare", 00:24:55.800 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:24:55.800 "is_configured": true, 00:24:55.800 "data_offset": 0, 00:24:55.800 "data_size": 65536 00:24:55.800 }, 00:24:55.800 { 00:24:55.800 "name": "BaseBdev2", 00:24:55.800 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:24:55.800 "is_configured": true, 00:24:55.800 "data_offset": 0, 00:24:55.800 "data_size": 
65536 00:24:55.800 } 00:24:55.800 ] 00:24:55.800 }' 00:24:55.800 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.800 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:55.800 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:55.800 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:55.800 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:56.058 [2024-07-24 20:00:47.433216] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:56.058 [2024-07-24 20:00:47.582042] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.316 [2024-07-24 20:00:47.690430] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:56.316 [2024-07-24 20:00:47.708464] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:56.316 [2024-07-24 20:00:47.708493] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.316 [2024-07-24 20:00:47.708504] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:56.316 [2024-07-24 20:00:47.738957] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf82950 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 
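As an aside on the progress JSON that `bdev_raid_get_bdevs` prints repeatedly in this log: the reported `percent` values are consistent with integer division of `progress.blocks` by the per-base-bdev `data_size` (e.g. 14336 * 100 // 65536 = 21 in the dump above). A minimal Python sketch using values copied from this run — this is an observation about the printed numbers, not a claim about how SPDK computes the field internally:

```python
import json

# Progress snapshot copied (trimmed) from the bdev_raid_get_bdevs output above.
raid_bdev_info = json.loads('''
{
  "name": "raid_bdev1",
  "process": {
    "type": "rebuild",
    "target": "spare",
    "progress": {"blocks": 14336, "percent": 21}
  },
  "base_bdevs_list": [
    {"name": "spare", "data_size": 65536},
    {"name": "BaseBdev2", "data_size": 65536}
  ]
}
''')

def rebuild_percent(info):
    # Integer-divide rebuilt blocks by the per-base-bdev data_size;
    # this reproduces every (blocks, percent) pair printed in this run,
    # e.g. 12288 -> 18, 18432 -> 28, 38912 -> 59, 61440 -> 93.
    blocks = info["process"]["progress"]["blocks"]
    data_size = info["base_bdevs_list"][0]["data_size"]
    return blocks * 100 // data_size

print(rebuild_percent(raid_bdev_info))  # -> 21
```

The same arithmetic applied to the later dumps in this log matches each reported `percent` exactly, which is a quick sanity check when eyeballing rebuild progress.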
00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.316 20:00:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.629 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:56.629 "name": "raid_bdev1", 00:24:56.629 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:24:56.629 "strip_size_kb": 0, 00:24:56.629 "state": "online", 00:24:56.629 "raid_level": "raid1", 00:24:56.629 "superblock": false, 00:24:56.629 "num_base_bdevs": 2, 00:24:56.629 "num_base_bdevs_discovered": 1, 00:24:56.629 "num_base_bdevs_operational": 1, 00:24:56.629 "base_bdevs_list": [ 00:24:56.629 { 00:24:56.629 "name": null, 00:24:56.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.629 "is_configured": false, 00:24:56.629 "data_offset": 0, 00:24:56.629 "data_size": 65536 00:24:56.629 }, 00:24:56.629 { 00:24:56.629 "name": "BaseBdev2", 00:24:56.629 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:24:56.629 "is_configured": true, 00:24:56.629 "data_offset": 0, 00:24:56.629 "data_size": 65536 00:24:56.629 } 
00:24:56.629 ] 00:24:56.629 }' 00:24:56.629 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:56.629 20:00:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:57.231 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:57.231 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:57.231 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:57.231 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:57.232 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:57.232 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.232 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.491 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.491 "name": "raid_bdev1", 00:24:57.491 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:24:57.491 "strip_size_kb": 0, 00:24:57.491 "state": "online", 00:24:57.491 "raid_level": "raid1", 00:24:57.491 "superblock": false, 00:24:57.491 "num_base_bdevs": 2, 00:24:57.491 "num_base_bdevs_discovered": 1, 00:24:57.491 "num_base_bdevs_operational": 1, 00:24:57.491 "base_bdevs_list": [ 00:24:57.491 { 00:24:57.491 "name": null, 00:24:57.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.491 "is_configured": false, 00:24:57.491 "data_offset": 0, 00:24:57.491 "data_size": 65536 00:24:57.491 }, 00:24:57.491 { 00:24:57.491 "name": "BaseBdev2", 00:24:57.491 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:24:57.491 "is_configured": true, 00:24:57.491 "data_offset": 0, 
00:24:57.491 "data_size": 65536 00:24:57.491 } 00:24:57.491 ] 00:24:57.491 }' 00:24:57.491 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.491 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:57.491 20:00:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.491 20:00:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:57.491 20:00:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:57.749 [2024-07-24 20:00:49.259896] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:57.749 20:00:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:57.749 [2024-07-24 20:00:49.336768] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf88f70 00:24:57.749 [2024-07-24 20:00:49.338274] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:58.007 [2024-07-24 20:00:49.440884] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:58.007 [2024-07-24 20:00:49.441356] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:58.264 [2024-07-24 20:00:49.670764] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:58.264 [2024-07-24 20:00:49.670950] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:58.524 [2024-07-24 20:00:50.008631] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 
12288 00:24:58.524 [2024-07-24 20:00:50.009145] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:58.783 [2024-07-24 20:00:50.238522] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:58.783 [2024-07-24 20:00:50.238805] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:58.783 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:58.783 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.783 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:58.783 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:58.783 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.783 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.783 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.042 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.042 "name": "raid_bdev1", 00:24:59.042 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:24:59.042 "strip_size_kb": 0, 00:24:59.042 "state": "online", 00:24:59.042 "raid_level": "raid1", 00:24:59.042 "superblock": false, 00:24:59.042 "num_base_bdevs": 2, 00:24:59.042 "num_base_bdevs_discovered": 2, 00:24:59.042 "num_base_bdevs_operational": 2, 00:24:59.042 "process": { 00:24:59.042 "type": "rebuild", 00:24:59.042 "target": "spare", 00:24:59.042 "progress": { 00:24:59.042 "blocks": 12288, 00:24:59.042 
"percent": 18 00:24:59.042 } 00:24:59.042 }, 00:24:59.042 "base_bdevs_list": [ 00:24:59.042 { 00:24:59.042 "name": "spare", 00:24:59.042 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:24:59.042 "is_configured": true, 00:24:59.042 "data_offset": 0, 00:24:59.042 "data_size": 65536 00:24:59.042 }, 00:24:59.042 { 00:24:59.042 "name": "BaseBdev2", 00:24:59.042 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:24:59.042 "is_configured": true, 00:24:59.042 "data_offset": 0, 00:24:59.042 "data_size": 65536 00:24:59.042 } 00:24:59.042 ] 00:24:59.042 }' 00:24:59.043 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.301 [2024-07-24 20:00:50.681260] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:59.301 [2024-07-24 20:00:50.681463] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=871 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:59.301 20:00:50 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.301 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.560 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.560 "name": "raid_bdev1", 00:24:59.560 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:24:59.560 "strip_size_kb": 0, 00:24:59.560 "state": "online", 00:24:59.560 "raid_level": "raid1", 00:24:59.560 "superblock": false, 00:24:59.560 "num_base_bdevs": 2, 00:24:59.560 "num_base_bdevs_discovered": 2, 00:24:59.560 "num_base_bdevs_operational": 2, 00:24:59.560 "process": { 00:24:59.560 "type": "rebuild", 00:24:59.560 "target": "spare", 00:24:59.560 "progress": { 00:24:59.560 "blocks": 18432, 00:24:59.560 "percent": 28 00:24:59.560 } 00:24:59.560 }, 00:24:59.560 "base_bdevs_list": [ 00:24:59.560 { 00:24:59.560 "name": "spare", 00:24:59.560 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:24:59.560 "is_configured": true, 00:24:59.560 "data_offset": 0, 00:24:59.560 "data_size": 65536 00:24:59.560 }, 00:24:59.560 { 00:24:59.560 "name": "BaseBdev2", 00:24:59.560 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:24:59.560 "is_configured": true, 00:24:59.560 "data_offset": 0, 00:24:59.560 "data_size": 65536 
00:24:59.560 } 00:24:59.560 ] 00:24:59.560 }' 00:24:59.560 20:00:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.560 20:00:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:59.560 20:00:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.560 20:00:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:59.560 20:00:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:59.820 [2024-07-24 20:00:51.367276] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:59.820 [2024-07-24 20:00:51.367609] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:00.078 [2024-07-24 20:00:51.513386] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:00.078 [2024-07-24 20:00:51.513551] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:00.335 [2024-07-24 20:00:51.926987] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:00.594 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:00.594 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.594 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.594 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.594 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.594 
20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.594 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.594 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.851 [2024-07-24 20:00:52.249200] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:00.851 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.851 "name": "raid_bdev1", 00:25:00.851 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:25:00.851 "strip_size_kb": 0, 00:25:00.851 "state": "online", 00:25:00.851 "raid_level": "raid1", 00:25:00.851 "superblock": false, 00:25:00.851 "num_base_bdevs": 2, 00:25:00.851 "num_base_bdevs_discovered": 2, 00:25:00.851 "num_base_bdevs_operational": 2, 00:25:00.851 "process": { 00:25:00.851 "type": "rebuild", 00:25:00.851 "target": "spare", 00:25:00.851 "progress": { 00:25:00.851 "blocks": 38912, 00:25:00.851 "percent": 59 00:25:00.851 } 00:25:00.851 }, 00:25:00.851 "base_bdevs_list": [ 00:25:00.851 { 00:25:00.851 "name": "spare", 00:25:00.851 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:25:00.851 "is_configured": true, 00:25:00.851 "data_offset": 0, 00:25:00.851 "data_size": 65536 00:25:00.851 }, 00:25:00.851 { 00:25:00.851 "name": "BaseBdev2", 00:25:00.851 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:25:00.851 "is_configured": true, 00:25:00.851 "data_offset": 0, 00:25:00.851 "data_size": 65536 00:25:00.851 } 00:25:00.851 ] 00:25:00.851 }' 00:25:00.851 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.851 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.851 
[2024-07-24 20:00:52.368470] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:25:00.851 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.851 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.851 20:00:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:02.227 "name": "raid_bdev1", 00:25:02.227 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:25:02.227 "strip_size_kb": 0, 00:25:02.227 "state": "online", 00:25:02.227 "raid_level": "raid1", 00:25:02.227 "superblock": false, 00:25:02.227 "num_base_bdevs": 2, 00:25:02.227 "num_base_bdevs_discovered": 2, 00:25:02.227 "num_base_bdevs_operational": 2, 00:25:02.227 "process": { 00:25:02.227 "type": "rebuild", 00:25:02.227 
"target": "spare", 00:25:02.227 "progress": { 00:25:02.227 "blocks": 61440, 00:25:02.227 "percent": 93 00:25:02.227 } 00:25:02.227 }, 00:25:02.227 "base_bdevs_list": [ 00:25:02.227 { 00:25:02.227 "name": "spare", 00:25:02.227 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:25:02.227 "is_configured": true, 00:25:02.227 "data_offset": 0, 00:25:02.227 "data_size": 65536 00:25:02.227 }, 00:25:02.227 { 00:25:02.227 "name": "BaseBdev2", 00:25:02.227 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:25:02.227 "is_configured": true, 00:25:02.227 "data_offset": 0, 00:25:02.227 "data_size": 65536 00:25:02.227 } 00:25:02.227 ] 00:25:02.227 }' 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:02.227 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:02.228 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:02.228 20:00:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:02.486 [2024-07-24 20:00:53.825018] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:02.486 [2024-07-24 20:00:53.925340] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:02.486 [2024-07-24 20:00:53.926671] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.423 20:00:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.683 "name": "raid_bdev1", 00:25:03.683 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:25:03.683 "strip_size_kb": 0, 00:25:03.683 "state": "online", 00:25:03.683 "raid_level": "raid1", 00:25:03.683 "superblock": false, 00:25:03.683 "num_base_bdevs": 2, 00:25:03.683 "num_base_bdevs_discovered": 2, 00:25:03.683 "num_base_bdevs_operational": 2, 00:25:03.683 "base_bdevs_list": [ 00:25:03.683 { 00:25:03.683 "name": "spare", 00:25:03.683 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:25:03.683 "is_configured": true, 00:25:03.683 "data_offset": 0, 00:25:03.683 "data_size": 65536 00:25:03.683 }, 00:25:03.683 { 00:25:03.683 "name": "BaseBdev2", 00:25:03.683 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:25:03.683 "is_configured": true, 00:25:03.683 "data_offset": 0, 00:25:03.683 "data_size": 65536 00:25:03.683 } 00:25:03.683 ] 00:25:03.683 }' 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e 
]] 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.683 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.943 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.943 "name": "raid_bdev1", 00:25:03.943 "uuid": "8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:25:03.943 "strip_size_kb": 0, 00:25:03.943 "state": "online", 00:25:03.943 "raid_level": "raid1", 00:25:03.943 "superblock": false, 00:25:03.943 "num_base_bdevs": 2, 00:25:03.943 "num_base_bdevs_discovered": 2, 00:25:03.943 "num_base_bdevs_operational": 2, 00:25:03.943 "base_bdevs_list": [ 00:25:03.943 { 00:25:03.943 "name": "spare", 00:25:03.943 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:25:03.943 "is_configured": true, 00:25:03.943 "data_offset": 0, 00:25:03.943 "data_size": 65536 00:25:03.943 }, 00:25:03.943 { 00:25:03.943 "name": "BaseBdev2", 00:25:03.943 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:25:03.943 "is_configured": true, 00:25:03.943 "data_offset": 0, 00:25:03.943 "data_size": 65536 00:25:03.943 } 00:25:03.943 ] 00:25:03.943 }' 00:25:03.943 20:00:55 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.944 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.202 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.202 "name": "raid_bdev1", 00:25:04.202 "uuid": 
"8fdb9758-3240-45fa-ad0e-6860de7dbbc1", 00:25:04.202 "strip_size_kb": 0, 00:25:04.202 "state": "online", 00:25:04.202 "raid_level": "raid1", 00:25:04.202 "superblock": false, 00:25:04.202 "num_base_bdevs": 2, 00:25:04.202 "num_base_bdevs_discovered": 2, 00:25:04.202 "num_base_bdevs_operational": 2, 00:25:04.202 "base_bdevs_list": [ 00:25:04.202 { 00:25:04.202 "name": "spare", 00:25:04.202 "uuid": "53284ad0-fd53-5a6b-a59a-a5e87e355d49", 00:25:04.202 "is_configured": true, 00:25:04.202 "data_offset": 0, 00:25:04.202 "data_size": 65536 00:25:04.202 }, 00:25:04.202 { 00:25:04.202 "name": "BaseBdev2", 00:25:04.202 "uuid": "bba6ee1f-b33e-5db0-a297-312e472753de", 00:25:04.202 "is_configured": true, 00:25:04.202 "data_offset": 0, 00:25:04.202 "data_size": 65536 00:25:04.202 } 00:25:04.202 ] 00:25:04.202 }' 00:25:04.202 20:00:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.202 20:00:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:04.770 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:05.030 [2024-07-24 20:00:56.553034] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:05.030 [2024-07-24 20:00:56.553067] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:25:05.030
00:25:05.030 Latency(us)
00:25:05.030 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:25:05.030 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:25:05.030 raid_bdev1 : 11.83 104.45 313.35 0.00 0.00 12551.45 288.50 118534.68
===================================================================================================================
00:25:05.030 Total : 104.45 313.35 0.00 0.00 12551.45 288.50 118534.68
00:25:05.030 [2024-07-24
20:00:56.572792] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:05.030 [2024-07-24 20:00:56.572819] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:05.030 [2024-07-24 20:00:56.572892] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:05.030 [2024-07-24 20:00:56.572904] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x104dc90 name raid_bdev1, state offline 00:25:05.030 0 00:25:05.030 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.030 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # jq length 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:05.289 20:00:56 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.289 20:00:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:05.547 /dev/nbd0 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:05.547 1+0 records in 00:25:05.547 1+0 records out 00:25:05.547 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274047 s, 14.9 MB/s 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.547 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:25:05.548 20:00:57 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.548 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:05.806 /dev/nbd1 00:25:05.806 20:00:57 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:05.806 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:05.807 1+0 records in 00:25:05.807 1+0 records out 00:25:05.807 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248709 s, 16.5 MB/s 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # 
return 0 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.807 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:06.066 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:06.066 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:06.066 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:06.066 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:06.066 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:06.066 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:06.066 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:06.323 20:00:57 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:06.323 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:06.324 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:25:06.324 20:00:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1493337 00:25:06.324 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 1493337 ']' 00:25:06.324 20:00:57 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 1493337 00:25:06.324 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:25:06.581 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:06.581 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1493337 00:25:06.581 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:06.581 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:06.581 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1493337' 00:25:06.581 killing process with pid 1493337 00:25:06.581 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 1493337 00:25:06.581 Received shutdown signal, test time was about 13.223535 seconds 00:25:06.581 00:25:06.581 Latency(us) 00:25:06.581 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:06.581 =================================================================================================================== 00:25:06.581 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:06.581 [2024-07-24 20:00:57.962218] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:06.581 20:00:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1493337 00:25:06.581 [2024-07-24 20:00:57.982733] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:25:06.840 00:25:06.840 real 0m17.907s 00:25:06.840 user 0m27.233s 00:25:06.840 sys 0m2.822s 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@10 -- # set +x 00:25:06.840 ************************************ 00:25:06.840 END TEST raid_rebuild_test_io 00:25:06.840 ************************************ 00:25:06.840 20:00:58 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:25:06.840 20:00:58 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:06.840 20:00:58 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:06.840 20:00:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:06.840 ************************************ 00:25:06.840 START TEST raid_rebuild_test_sb_io 00:25:06.840 ************************************ 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true true true 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # 
echo BaseBdev2 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1495904 00:25:06.840 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1495904 /var/tmp/spdk-raid.sock 00:25:06.841 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:06.841 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1495904 
']' 00:25:06.841 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:06.841 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:06.841 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:06.841 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:06.841 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:06.841 20:00:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:06.841 [2024-07-24 20:00:58.356795] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:25:06.841 [2024-07-24 20:00:58.356867] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1495904 ] 00:25:06.841 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:06.841 Zero copy mechanism will not be used. 
00:25:07.098 [2024-07-24 20:00:58.487400] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:07.098 [2024-07-24 20:00:58.588647] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:07.098 [2024-07-24 20:00:58.645979] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:07.098 [2024-07-24 20:00:58.646022] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:08.035 20:00:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:08.035 20:00:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:25:08.035 20:00:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:08.035 20:00:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:08.036 BaseBdev1_malloc 00:25:08.036 20:00:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:08.295 [2024-07-24 20:00:59.773213] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:08.295 [2024-07-24 20:00:59.773270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.295 [2024-07-24 20:00:59.773296] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a2cd0 00:25:08.295 [2024-07-24 20:00:59.773309] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.295 [2024-07-24 20:00:59.775042] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.295 [2024-07-24 20:00:59.775078] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:08.295 
BaseBdev1 00:25:08.295 20:00:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:08.295 20:00:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:08.554 BaseBdev2_malloc 00:25:08.554 20:01:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:08.812 [2024-07-24 20:01:00.279352] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:08.812 [2024-07-24 20:01:00.279413] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.812 [2024-07-24 20:01:00.279435] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a6460 00:25:08.812 [2024-07-24 20:01:00.279448] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.812 [2024-07-24 20:01:00.281094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.812 [2024-07-24 20:01:00.281125] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:08.812 BaseBdev2 00:25:08.812 20:01:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:09.071 spare_malloc 00:25:09.071 20:01:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:09.329 spare_delay 00:25:09.329 20:01:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:09.587 [2024-07-24 20:01:01.039271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:09.587 [2024-07-24 20:01:01.039324] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:09.587 [2024-07-24 20:01:01.039346] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x169ac70 00:25:09.587 [2024-07-24 20:01:01.039359] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:09.587 [2024-07-24 20:01:01.041025] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:09.587 [2024-07-24 20:01:01.041057] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:09.587 spare 00:25:09.587 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:09.846 [2024-07-24 20:01:01.287965] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:09.846 [2024-07-24 20:01:01.289327] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:09.846 [2024-07-24 20:01:01.289527] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1765c90 00:25:09.846 [2024-07-24 20:01:01.289541] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:09.846 [2024-07-24 20:01:01.289753] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169af00 00:25:09.846 [2024-07-24 20:01:01.289898] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1765c90 00:25:09.846 [2024-07-24 20:01:01.289908] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1765c90 00:25:09.846 [2024-07-24 20:01:01.290012] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.846 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.105 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.105 "name": "raid_bdev1", 00:25:10.105 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:10.105 "strip_size_kb": 0, 00:25:10.105 "state": "online", 00:25:10.105 "raid_level": "raid1", 00:25:10.105 "superblock": true, 00:25:10.105 "num_base_bdevs": 2, 00:25:10.105 
"num_base_bdevs_discovered": 2, 00:25:10.105 "num_base_bdevs_operational": 2, 00:25:10.105 "base_bdevs_list": [ 00:25:10.105 { 00:25:10.105 "name": "BaseBdev1", 00:25:10.105 "uuid": "d53fe6ba-c53c-5b74-beb2-48c7c08195f9", 00:25:10.105 "is_configured": true, 00:25:10.105 "data_offset": 2048, 00:25:10.105 "data_size": 63488 00:25:10.105 }, 00:25:10.105 { 00:25:10.105 "name": "BaseBdev2", 00:25:10.105 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:10.105 "is_configured": true, 00:25:10.105 "data_offset": 2048, 00:25:10.105 "data_size": 63488 00:25:10.105 } 00:25:10.105 ] 00:25:10.105 }' 00:25:10.105 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.105 20:01:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:10.673 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:10.673 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:25:10.673 [2024-07-24 20:01:02.238693] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:10.673 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:25:10.933 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.933 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:10.933 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:25:10.933 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:25:10.933 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:10.933 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:11.192 [2024-07-24 20:01:02.621750] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169a910 00:25:11.192 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:11.192 Zero copy mechanism will not be used. 00:25:11.192 Running I/O for 60 seconds... 00:25:11.192 [2024-07-24 20:01:02.757375] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:11.192 [2024-07-24 20:01:02.757599] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x169a910 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.450 20:01:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.708 20:01:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:11.708 "name": "raid_bdev1", 00:25:11.708 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:11.708 "strip_size_kb": 0, 00:25:11.708 "state": "online", 00:25:11.708 "raid_level": "raid1", 00:25:11.708 "superblock": true, 00:25:11.708 "num_base_bdevs": 2, 00:25:11.708 "num_base_bdevs_discovered": 1, 00:25:11.708 "num_base_bdevs_operational": 1, 00:25:11.708 "base_bdevs_list": [ 00:25:11.708 { 00:25:11.708 "name": null, 00:25:11.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:11.708 "is_configured": false, 00:25:11.708 "data_offset": 2048, 00:25:11.708 "data_size": 63488 00:25:11.708 }, 00:25:11.708 { 00:25:11.708 "name": "BaseBdev2", 00:25:11.708 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:11.708 "is_configured": true, 00:25:11.708 "data_offset": 2048, 00:25:11.708 "data_size": 63488 00:25:11.708 } 00:25:11.708 ] 00:25:11.708 }' 00:25:11.708 20:01:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:11.708 20:01:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:12.274 20:01:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:12.533 [2024-07-24 20:01:03.907051] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:12.534 [2024-07-24 20:01:03.965615] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13a9fb0 00:25:12.534 20:01:03 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:12.534 [2024-07-24 20:01:03.967985] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:12.534 [2024-07-24 20:01:04.100523] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:12.534 [2024-07-24 20:01:04.109037] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:12.794 [2024-07-24 20:01:04.328785] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:12.794 [2024-07-24 20:01:04.328973] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:13.362 [2024-07-24 20:01:04.827481] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:13.621 20:01:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:13.621 20:01:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.621 20:01:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:13.621 20:01:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:13.621 20:01:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.621 20:01:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.621 20:01:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.621 [2024-07-24 20:01:05.166890] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:13.621 [2024-07-24 20:01:05.167358] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:13.879 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.879 "name": "raid_bdev1", 00:25:13.879 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:13.879 "strip_size_kb": 0, 00:25:13.879 "state": "online", 00:25:13.879 "raid_level": "raid1", 00:25:13.879 "superblock": true, 00:25:13.879 "num_base_bdevs": 2, 00:25:13.879 "num_base_bdevs_discovered": 2, 00:25:13.879 "num_base_bdevs_operational": 2, 00:25:13.879 "process": { 00:25:13.879 "type": "rebuild", 00:25:13.879 "target": "spare", 00:25:13.879 "progress": { 00:25:13.879 "blocks": 14336, 00:25:13.879 "percent": 22 00:25:13.879 } 00:25:13.879 }, 00:25:13.879 "base_bdevs_list": [ 00:25:13.879 { 00:25:13.879 "name": "spare", 00:25:13.879 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:13.879 "is_configured": true, 00:25:13.879 "data_offset": 2048, 00:25:13.879 "data_size": 63488 00:25:13.879 }, 00:25:13.879 { 00:25:13.879 "name": "BaseBdev2", 00:25:13.879 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:13.879 "is_configured": true, 00:25:13.879 "data_offset": 2048, 00:25:13.879 "data_size": 63488 00:25:13.879 } 00:25:13.879 ] 00:25:13.879 }' 00:25:13.879 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.879 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:13.880 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.880 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:13.880 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:13.880 [2024-07-24 20:01:05.385908] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:13.880 [2024-07-24 20:01:05.386152] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:14.139 [2024-07-24 20:01:05.602464] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:14.139 [2024-07-24 20:01:05.725460] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:14.397 [2024-07-24 20:01:05.743497] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:14.397 [2024-07-24 20:01:05.743527] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:14.397 [2024-07-24 20:01:05.743539] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:14.397 [2024-07-24 20:01:05.773410] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x169a910 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.397 20:01:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.656 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.656 "name": "raid_bdev1", 00:25:14.656 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:14.656 "strip_size_kb": 0, 00:25:14.656 "state": "online", 00:25:14.656 "raid_level": "raid1", 00:25:14.656 "superblock": true, 00:25:14.656 "num_base_bdevs": 2, 00:25:14.656 "num_base_bdevs_discovered": 1, 00:25:14.656 "num_base_bdevs_operational": 1, 00:25:14.656 "base_bdevs_list": [ 00:25:14.656 { 00:25:14.656 "name": null, 00:25:14.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.656 "is_configured": false, 00:25:14.656 "data_offset": 2048, 00:25:14.656 "data_size": 63488 00:25:14.656 }, 00:25:14.656 { 00:25:14.656 "name": "BaseBdev2", 00:25:14.656 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:14.656 "is_configured": true, 00:25:14.656 "data_offset": 2048, 00:25:14.656 "data_size": 63488 00:25:14.656 } 00:25:14.656 ] 00:25:14.656 }' 00:25:14.656 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.656 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:15.223 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 
none none 00:25:15.223 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:15.223 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:15.223 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:15.223 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:15.224 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.224 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.539 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:15.540 "name": "raid_bdev1", 00:25:15.540 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:15.540 "strip_size_kb": 0, 00:25:15.540 "state": "online", 00:25:15.540 "raid_level": "raid1", 00:25:15.540 "superblock": true, 00:25:15.540 "num_base_bdevs": 2, 00:25:15.540 "num_base_bdevs_discovered": 1, 00:25:15.540 "num_base_bdevs_operational": 1, 00:25:15.540 "base_bdevs_list": [ 00:25:15.540 { 00:25:15.540 "name": null, 00:25:15.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.540 "is_configured": false, 00:25:15.540 "data_offset": 2048, 00:25:15.540 "data_size": 63488 00:25:15.540 }, 00:25:15.540 { 00:25:15.540 "name": "BaseBdev2", 00:25:15.540 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:15.540 "is_configured": true, 00:25:15.540 "data_offset": 2048, 00:25:15.540 "data_size": 63488 00:25:15.540 } 00:25:15.540 ] 00:25:15.540 }' 00:25:15.540 20:01:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:15.540 20:01:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:15.540 
20:01:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:15.540 20:01:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:15.540 20:01:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:15.799 [2024-07-24 20:01:07.294551] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:15.799 20:01:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:15.799 [2024-07-24 20:01:07.362101] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16b7c10 00:25:15.799 [2024-07-24 20:01:07.363816] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:16.057 [2024-07-24 20:01:07.490329] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:16.057 [2024-07-24 20:01:07.490730] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:16.315 [2024-07-24 20:01:07.701559] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:16.315 [2024-07-24 20:01:07.701763] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:16.584 [2024-07-24 20:01:08.082735] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:16.847 [2024-07-24 20:01:08.309424] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:16.847 [2024-07-24 20:01:08.309564] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 
offset_begin: 6144 offset_end: 12288 00:25:16.847 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:16.847 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.847 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:16.847 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:16.847 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.847 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.847 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.107 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:17.107 "name": "raid_bdev1", 00:25:17.107 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:17.107 "strip_size_kb": 0, 00:25:17.107 "state": "online", 00:25:17.107 "raid_level": "raid1", 00:25:17.107 "superblock": true, 00:25:17.107 "num_base_bdevs": 2, 00:25:17.107 "num_base_bdevs_discovered": 2, 00:25:17.107 "num_base_bdevs_operational": 2, 00:25:17.107 "process": { 00:25:17.107 "type": "rebuild", 00:25:17.107 "target": "spare", 00:25:17.107 "progress": { 00:25:17.107 "blocks": 12288, 00:25:17.107 "percent": 19 00:25:17.107 } 00:25:17.107 }, 00:25:17.107 "base_bdevs_list": [ 00:25:17.107 { 00:25:17.107 "name": "spare", 00:25:17.107 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:17.107 "is_configured": true, 00:25:17.107 "data_offset": 2048, 00:25:17.107 "data_size": 63488 00:25:17.107 }, 00:25:17.107 { 00:25:17.107 "name": "BaseBdev2", 00:25:17.107 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:17.107 
"is_configured": true, 00:25:17.107 "data_offset": 2048, 00:25:17.107 "data_size": 63488 00:25:17.107 } 00:25:17.107 ] 00:25:17.107 }' 00:25:17.107 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:17.107 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:17.107 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:25:17.365 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=889 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:17.365 20:01:08 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.365 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.623 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:17.623 "name": "raid_bdev1", 00:25:17.623 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:17.623 "strip_size_kb": 0, 00:25:17.623 "state": "online", 00:25:17.623 "raid_level": "raid1", 00:25:17.623 "superblock": true, 00:25:17.623 "num_base_bdevs": 2, 00:25:17.623 "num_base_bdevs_discovered": 2, 00:25:17.623 "num_base_bdevs_operational": 2, 00:25:17.623 "process": { 00:25:17.623 "type": "rebuild", 00:25:17.623 "target": "spare", 00:25:17.623 "progress": { 00:25:17.623 "blocks": 16384, 00:25:17.623 "percent": 25 00:25:17.623 } 00:25:17.623 }, 00:25:17.623 "base_bdevs_list": [ 00:25:17.623 { 00:25:17.623 "name": "spare", 00:25:17.623 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:17.623 "is_configured": true, 00:25:17.623 "data_offset": 2048, 00:25:17.623 "data_size": 63488 00:25:17.623 }, 00:25:17.623 { 00:25:17.623 "name": "BaseBdev2", 00:25:17.623 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:17.623 "is_configured": true, 00:25:17.623 "data_offset": 2048, 00:25:17.623 "data_size": 63488 00:25:17.623 } 00:25:17.623 ] 00:25:17.623 }' 00:25:17.623 20:01:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:17.623 20:01:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:17.623 20:01:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:17.623 20:01:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:17.623 20:01:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:17.623 [2024-07-24 20:01:09.159337] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:18.191 [2024-07-24 20:01:09.490416] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:18.191 [2024-07-24 20:01:09.700209] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:18.760 [2024-07-24 20:01:10.046575] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:18.760 "name": "raid_bdev1", 
00:25:18.760 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:18.760 "strip_size_kb": 0, 00:25:18.760 "state": "online", 00:25:18.760 "raid_level": "raid1", 00:25:18.760 "superblock": true, 00:25:18.760 "num_base_bdevs": 2, 00:25:18.760 "num_base_bdevs_discovered": 2, 00:25:18.760 "num_base_bdevs_operational": 2, 00:25:18.760 "process": { 00:25:18.760 "type": "rebuild", 00:25:18.760 "target": "spare", 00:25:18.760 "progress": { 00:25:18.760 "blocks": 32768, 00:25:18.760 "percent": 51 00:25:18.760 } 00:25:18.760 }, 00:25:18.760 "base_bdevs_list": [ 00:25:18.760 { 00:25:18.760 "name": "spare", 00:25:18.760 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:18.760 "is_configured": true, 00:25:18.760 "data_offset": 2048, 00:25:18.760 "data_size": 63488 00:25:18.760 }, 00:25:18.760 { 00:25:18.760 "name": "BaseBdev2", 00:25:18.760 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:18.760 "is_configured": true, 00:25:18.760 "data_offset": 2048, 00:25:18.760 "data_size": 63488 00:25:18.760 } 00:25:18.760 ] 00:25:18.760 }' 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:18.760 [2024-07-24 20:01:10.274735] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:18.760 20:01:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:19.698 [2024-07-24 20:01:11.224767] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:19.698 [2024-07-24 20:01:11.225227] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.958 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.958 [2024-07-24 20:01:11.462919] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:25:20.217 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:20.217 "name": "raid_bdev1", 00:25:20.217 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:20.217 "strip_size_kb": 0, 00:25:20.217 "state": "online", 00:25:20.217 "raid_level": "raid1", 00:25:20.217 "superblock": true, 00:25:20.217 "num_base_bdevs": 2, 00:25:20.217 "num_base_bdevs_discovered": 2, 00:25:20.217 "num_base_bdevs_operational": 2, 00:25:20.217 "process": { 00:25:20.217 "type": "rebuild", 00:25:20.217 "target": "spare", 00:25:20.217 "progress": { 00:25:20.217 "blocks": 53248, 00:25:20.217 "percent": 83 00:25:20.217 } 00:25:20.217 }, 00:25:20.217 "base_bdevs_list": [ 
00:25:20.217 { 00:25:20.217 "name": "spare", 00:25:20.217 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:20.217 "is_configured": true, 00:25:20.217 "data_offset": 2048, 00:25:20.217 "data_size": 63488 00:25:20.217 }, 00:25:20.217 { 00:25:20.217 "name": "BaseBdev2", 00:25:20.217 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:20.217 "is_configured": true, 00:25:20.217 "data_offset": 2048, 00:25:20.217 "data_size": 63488 00:25:20.217 } 00:25:20.217 ] 00:25:20.217 }' 00:25:20.217 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:20.217 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:20.217 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:20.217 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:20.217 20:01:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:20.785 [2024-07-24 20:01:12.130662] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:20.785 [2024-07-24 20:01:12.238953] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:20.785 [2024-07-24 20:01:12.241266] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.353 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:21.353 "name": "raid_bdev1", 00:25:21.353 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:21.353 "strip_size_kb": 0, 00:25:21.353 "state": "online", 00:25:21.353 "raid_level": "raid1", 00:25:21.354 "superblock": true, 00:25:21.354 "num_base_bdevs": 2, 00:25:21.354 "num_base_bdevs_discovered": 2, 00:25:21.354 "num_base_bdevs_operational": 2, 00:25:21.354 "base_bdevs_list": [ 00:25:21.354 { 00:25:21.354 "name": "spare", 00:25:21.354 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:21.354 "is_configured": true, 00:25:21.354 "data_offset": 2048, 00:25:21.354 "data_size": 63488 00:25:21.354 }, 00:25:21.354 { 00:25:21.354 "name": "BaseBdev2", 00:25:21.354 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:21.354 "is_configured": true, 00:25:21.354 "data_offset": 2048, 00:25:21.354 "data_size": 63488 00:25:21.354 } 00:25:21.354 ] 00:25:21.354 }' 00:25:21.354 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:21.612 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:21.612 20:01:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:25:21.612 
20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.612 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:21.872 "name": "raid_bdev1", 00:25:21.872 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:21.872 "strip_size_kb": 0, 00:25:21.872 "state": "online", 00:25:21.872 "raid_level": "raid1", 00:25:21.872 "superblock": true, 00:25:21.872 "num_base_bdevs": 2, 00:25:21.872 "num_base_bdevs_discovered": 2, 00:25:21.872 "num_base_bdevs_operational": 2, 00:25:21.872 "base_bdevs_list": [ 00:25:21.872 { 00:25:21.872 "name": "spare", 00:25:21.872 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:21.872 "is_configured": true, 00:25:21.872 "data_offset": 2048, 00:25:21.872 "data_size": 63488 00:25:21.872 }, 00:25:21.872 { 00:25:21.872 "name": "BaseBdev2", 00:25:21.872 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:21.872 "is_configured": true, 00:25:21.872 "data_offset": 2048, 00:25:21.872 "data_size": 63488 00:25:21.872 } 00:25:21.872 ] 00:25:21.872 }' 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:21.872 
20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.872 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.136 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:22.136 "name": "raid_bdev1", 00:25:22.136 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:22.136 
"strip_size_kb": 0, 00:25:22.136 "state": "online", 00:25:22.136 "raid_level": "raid1", 00:25:22.136 "superblock": true, 00:25:22.136 "num_base_bdevs": 2, 00:25:22.136 "num_base_bdevs_discovered": 2, 00:25:22.136 "num_base_bdevs_operational": 2, 00:25:22.136 "base_bdevs_list": [ 00:25:22.136 { 00:25:22.136 "name": "spare", 00:25:22.136 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:22.136 "is_configured": true, 00:25:22.136 "data_offset": 2048, 00:25:22.136 "data_size": 63488 00:25:22.136 }, 00:25:22.136 { 00:25:22.136 "name": "BaseBdev2", 00:25:22.136 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:22.136 "is_configured": true, 00:25:22.136 "data_offset": 2048, 00:25:22.136 "data_size": 63488 00:25:22.136 } 00:25:22.136 ] 00:25:22.136 }' 00:25:22.136 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:22.136 20:01:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:22.702 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:22.702 [2024-07-24 20:01:14.284736] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:22.702 [2024-07-24 20:01:14.284773] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:22.961 00:25:22.961 Latency(us) 00:25:22.961 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:22.961 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:22.961 raid_bdev1 : 11.68 100.22 300.66 0.00 0.00 13639.06 290.28 118534.68 00:25:22.961 =================================================================================================================== 00:25:22.961 Total : 100.22 300.66 0.00 0.00 13639.06 290.28 118534.68 00:25:22.961 [2024-07-24 20:01:14.340822] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:22.961 [2024-07-24 20:01:14.340860] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:22.961 [2024-07-24 20:01:14.340936] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:22.961 [2024-07-24 20:01:14.340949] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1765c90 name raid_bdev1, state offline 00:25:22.961 0 00:25:22.961 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.961 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:23.220 20:01:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:23.220 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:23.479 /dev/nbd0 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:23.479 1+0 records in 00:25:23.479 1+0 records out 00:25:23.479 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285673 s, 14.3 MB/s 00:25:23.479 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 
00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:23.480 20:01:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 
00:25:23.739 /dev/nbd1 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:23.739 1+0 records in 00:25:23.739 1+0 records out 00:25:23.739 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259492 s, 15.8 MB/s 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 
00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:23.739 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@41 -- # break 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:24.000 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:25:24.259 20:01:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:24.518 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:24.777 [2024-07-24 20:01:16.307982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:24.777 [2024-07-24 20:01:16.308034] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:24.777 [2024-07-24 20:01:16.308056] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x176ee40 00:25:24.777 [2024-07-24 20:01:16.308069] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:24.777 [2024-07-24 20:01:16.309714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:24.777 [2024-07-24 20:01:16.309746] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:24.777 [2024-07-24 20:01:16.309833] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:24.777 [2024-07-24 20:01:16.309863] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:24.777 [2024-07-24 20:01:16.309970] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:24.777 spare 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:24.777 20:01:16 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:24.777 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:24.778 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.778 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.039 [2024-07-24 20:01:16.410287] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16b5ef0 00:25:25.039 [2024-07-24 20:01:16.410307] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:25.039 [2024-07-24 20:01:16.410509] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1766980 00:25:25.039 [2024-07-24 20:01:16.410660] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16b5ef0 00:25:25.039 [2024-07-24 20:01:16.410671] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16b5ef0 00:25:25.039 [2024-07-24 20:01:16.410783] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:25.039 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.039 "name": "raid_bdev1", 00:25:25.039 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:25.040 "strip_size_kb": 0, 00:25:25.040 "state": "online", 00:25:25.040 
"raid_level": "raid1", 00:25:25.040 "superblock": true, 00:25:25.040 "num_base_bdevs": 2, 00:25:25.040 "num_base_bdevs_discovered": 2, 00:25:25.040 "num_base_bdevs_operational": 2, 00:25:25.040 "base_bdevs_list": [ 00:25:25.040 { 00:25:25.040 "name": "spare", 00:25:25.040 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:25.040 "is_configured": true, 00:25:25.040 "data_offset": 2048, 00:25:25.040 "data_size": 63488 00:25:25.040 }, 00:25:25.040 { 00:25:25.040 "name": "BaseBdev2", 00:25:25.040 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:25.040 "is_configured": true, 00:25:25.040 "data_offset": 2048, 00:25:25.040 "data_size": 63488 00:25:25.040 } 00:25:25.040 ] 00:25:25.040 }' 00:25:25.040 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.040 20:01:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:25.609 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:25.609 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.609 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:25.609 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:25.609 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.609 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.609 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.870 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:25.870 "name": "raid_bdev1", 00:25:25.870 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 
00:25:25.870 "strip_size_kb": 0, 00:25:25.870 "state": "online", 00:25:25.870 "raid_level": "raid1", 00:25:25.870 "superblock": true, 00:25:25.870 "num_base_bdevs": 2, 00:25:25.870 "num_base_bdevs_discovered": 2, 00:25:25.870 "num_base_bdevs_operational": 2, 00:25:25.870 "base_bdevs_list": [ 00:25:25.870 { 00:25:25.870 "name": "spare", 00:25:25.870 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:25.870 "is_configured": true, 00:25:25.870 "data_offset": 2048, 00:25:25.870 "data_size": 63488 00:25:25.870 }, 00:25:25.870 { 00:25:25.870 "name": "BaseBdev2", 00:25:25.870 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:25.870 "is_configured": true, 00:25:25.870 "data_offset": 2048, 00:25:25.870 "data_size": 63488 00:25:25.870 } 00:25:25.870 ] 00:25:25.870 }' 00:25:25.870 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:25.870 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:25.870 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.130 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:26.130 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:26.130 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.130 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:25:26.130 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:26.390 [2024-07-24 20:01:17.948675] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:26.390 20:01:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.390 20:01:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.650 20:01:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.650 "name": "raid_bdev1", 00:25:26.650 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:26.650 "strip_size_kb": 0, 00:25:26.650 "state": "online", 00:25:26.650 "raid_level": "raid1", 00:25:26.650 "superblock": true, 00:25:26.650 "num_base_bdevs": 2, 00:25:26.650 "num_base_bdevs_discovered": 1, 00:25:26.650 "num_base_bdevs_operational": 1, 00:25:26.650 "base_bdevs_list": [ 00:25:26.650 { 00:25:26.650 "name": null, 00:25:26.650 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:26.650 "is_configured": false, 00:25:26.650 "data_offset": 2048, 00:25:26.650 "data_size": 63488 00:25:26.650 }, 00:25:26.650 { 00:25:26.650 "name": "BaseBdev2", 00:25:26.650 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:26.650 "is_configured": true, 00:25:26.650 "data_offset": 2048, 00:25:26.650 "data_size": 63488 00:25:26.650 } 00:25:26.650 ] 00:25:26.650 }' 00:25:26.650 20:01:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.650 20:01:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:27.225 20:01:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:27.484 [2024-07-24 20:01:19.019677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:27.484 [2024-07-24 20:01:19.019835] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:27.484 [2024-07-24 20:01:19.019853] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:27.484 [2024-07-24 20:01:19.019882] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:27.484 [2024-07-24 20:01:19.025196] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17646e0 00:25:27.484 [2024-07-24 20:01:19.027451] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:27.484 20:01:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.858 "name": "raid_bdev1", 00:25:28.858 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:28.858 "strip_size_kb": 0, 00:25:28.858 "state": "online", 00:25:28.858 "raid_level": "raid1", 00:25:28.858 "superblock": true, 00:25:28.858 "num_base_bdevs": 2, 00:25:28.858 "num_base_bdevs_discovered": 2, 00:25:28.858 "num_base_bdevs_operational": 2, 00:25:28.858 "process": { 00:25:28.858 "type": "rebuild", 00:25:28.858 "target": "spare", 00:25:28.858 "progress": { 00:25:28.858 "blocks": 22528, 
00:25:28.858 "percent": 35 00:25:28.858 } 00:25:28.858 }, 00:25:28.858 "base_bdevs_list": [ 00:25:28.858 { 00:25:28.858 "name": "spare", 00:25:28.858 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:28.858 "is_configured": true, 00:25:28.858 "data_offset": 2048, 00:25:28.858 "data_size": 63488 00:25:28.858 }, 00:25:28.858 { 00:25:28.858 "name": "BaseBdev2", 00:25:28.858 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:28.858 "is_configured": true, 00:25:28.858 "data_offset": 2048, 00:25:28.858 "data_size": 63488 00:25:28.858 } 00:25:28.858 ] 00:25:28.858 }' 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:28.858 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:29.115 [2024-07-24 20:01:20.557470] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:29.115 [2024-07-24 20:01:20.640049] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:29.115 [2024-07-24 20:01:20.640099] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.115 [2024-07-24 20:01:20.640115] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:29.115 [2024-07-24 20:01:20.640124] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.115 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.373 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:29.373 "name": "raid_bdev1", 00:25:29.373 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:29.373 "strip_size_kb": 0, 00:25:29.373 "state": "online", 00:25:29.373 "raid_level": "raid1", 00:25:29.373 "superblock": true, 00:25:29.373 "num_base_bdevs": 2, 00:25:29.373 "num_base_bdevs_discovered": 1, 00:25:29.373 "num_base_bdevs_operational": 1, 00:25:29.373 "base_bdevs_list": [ 00:25:29.373 { 00:25:29.373 "name": null, 00:25:29.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.373 "is_configured": false, 00:25:29.373 
"data_offset": 2048, 00:25:29.373 "data_size": 63488 00:25:29.373 }, 00:25:29.373 { 00:25:29.373 "name": "BaseBdev2", 00:25:29.373 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:29.373 "is_configured": true, 00:25:29.373 "data_offset": 2048, 00:25:29.373 "data_size": 63488 00:25:29.373 } 00:25:29.373 ] 00:25:29.373 }' 00:25:29.373 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:29.373 20:01:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:29.940 20:01:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:30.198 [2024-07-24 20:01:21.740436] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:30.198 [2024-07-24 20:01:21.740493] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.198 [2024-07-24 20:01:21.740522] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16b6170 00:25:30.198 [2024-07-24 20:01:21.740535] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.198 [2024-07-24 20:01:21.740921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.198 [2024-07-24 20:01:21.740939] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:30.198 [2024-07-24 20:01:21.741025] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:30.198 [2024-07-24 20:01:21.741039] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:30.198 [2024-07-24 20:01:21.741051] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:30.198 [2024-07-24 20:01:21.741069] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:30.198 [2024-07-24 20:01:21.746374] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1766980 00:25:30.198 spare 00:25:30.198 [2024-07-24 20:01:21.747852] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:30.198 20:01:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:25:31.589 20:01:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:31.589 20:01:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.589 20:01:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:31.589 20:01:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:31.589 20:01:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.589 20:01:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.589 20:01:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.589 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.589 "name": "raid_bdev1", 00:25:31.589 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:31.589 "strip_size_kb": 0, 00:25:31.589 "state": "online", 00:25:31.589 "raid_level": "raid1", 00:25:31.589 "superblock": true, 00:25:31.589 "num_base_bdevs": 2, 00:25:31.589 "num_base_bdevs_discovered": 2, 00:25:31.589 "num_base_bdevs_operational": 2, 00:25:31.589 "process": { 00:25:31.589 "type": "rebuild", 00:25:31.589 "target": "spare", 00:25:31.589 "progress": { 00:25:31.589 
"blocks": 24576, 00:25:31.589 "percent": 38 00:25:31.589 } 00:25:31.589 }, 00:25:31.589 "base_bdevs_list": [ 00:25:31.589 { 00:25:31.589 "name": "spare", 00:25:31.589 "uuid": "e669e077-6d83-526c-ba48-8ac8b1846758", 00:25:31.589 "is_configured": true, 00:25:31.589 "data_offset": 2048, 00:25:31.589 "data_size": 63488 00:25:31.589 }, 00:25:31.589 { 00:25:31.589 "name": "BaseBdev2", 00:25:31.589 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:31.589 "is_configured": true, 00:25:31.589 "data_offset": 2048, 00:25:31.589 "data_size": 63488 00:25:31.589 } 00:25:31.589 ] 00:25:31.589 }' 00:25:31.589 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.589 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:31.589 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.589 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:31.589 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:31.847 [2024-07-24 20:01:23.336672] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:31.847 [2024-07-24 20:01:23.360660] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:31.847 [2024-07-24 20:01:23.360704] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:31.847 [2024-07-24 20:01:23.360720] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:31.847 [2024-07-24 20:01:23.360728] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:31.847 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.848 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.106 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:32.106 "name": "raid_bdev1", 00:25:32.106 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:32.106 "strip_size_kb": 0, 00:25:32.106 "state": "online", 00:25:32.106 "raid_level": "raid1", 00:25:32.106 "superblock": true, 00:25:32.106 "num_base_bdevs": 2, 00:25:32.106 "num_base_bdevs_discovered": 1, 00:25:32.106 "num_base_bdevs_operational": 1, 00:25:32.106 "base_bdevs_list": [ 00:25:32.106 { 00:25:32.106 "name": null, 00:25:32.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.106 "is_configured": false, 00:25:32.106 
"data_offset": 2048, 00:25:32.106 "data_size": 63488 00:25:32.106 }, 00:25:32.106 { 00:25:32.106 "name": "BaseBdev2", 00:25:32.106 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:32.106 "is_configured": true, 00:25:32.106 "data_offset": 2048, 00:25:32.106 "data_size": 63488 00:25:32.106 } 00:25:32.106 ] 00:25:32.106 }' 00:25:32.106 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:32.106 20:01:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:32.673 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:32.673 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.673 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:32.673 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:32.673 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.673 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.673 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.932 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.932 "name": "raid_bdev1", 00:25:32.932 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:32.932 "strip_size_kb": 0, 00:25:32.932 "state": "online", 00:25:32.932 "raid_level": "raid1", 00:25:32.932 "superblock": true, 00:25:32.932 "num_base_bdevs": 2, 00:25:32.932 "num_base_bdevs_discovered": 1, 00:25:32.932 "num_base_bdevs_operational": 1, 00:25:32.932 "base_bdevs_list": [ 00:25:32.932 { 00:25:32.932 "name": null, 00:25:32.932 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:32.932 "is_configured": false, 00:25:32.932 "data_offset": 2048, 00:25:32.932 "data_size": 63488 00:25:32.932 }, 00:25:32.932 { 00:25:32.932 "name": "BaseBdev2", 00:25:32.932 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:32.932 "is_configured": true, 00:25:32.932 "data_offset": 2048, 00:25:32.932 "data_size": 63488 00:25:32.932 } 00:25:32.932 ] 00:25:32.932 }' 00:25:32.932 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.192 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:33.192 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.192 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:33.192 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:33.452 20:01:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:33.712 [2024-07-24 20:01:25.065899] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:33.712 [2024-07-24 20:01:25.065949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:33.712 [2024-07-24 20:01:25.065969] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16a2f00 00:25:33.712 [2024-07-24 20:01:25.065982] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:33.712 [2024-07-24 20:01:25.066331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:33.712 [2024-07-24 20:01:25.066349] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:33.712 [2024-07-24 20:01:25.066434] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:33.712 [2024-07-24 20:01:25.066447] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:33.712 [2024-07-24 20:01:25.066459] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:33.712 BaseBdev1 00:25:33.712 20:01:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.649 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.650 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.650 20:01:26 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.907 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:34.907 "name": "raid_bdev1", 00:25:34.907 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:34.907 "strip_size_kb": 0, 00:25:34.907 "state": "online", 00:25:34.907 "raid_level": "raid1", 00:25:34.907 "superblock": true, 00:25:34.907 "num_base_bdevs": 2, 00:25:34.907 "num_base_bdevs_discovered": 1, 00:25:34.907 "num_base_bdevs_operational": 1, 00:25:34.907 "base_bdevs_list": [ 00:25:34.907 { 00:25:34.907 "name": null, 00:25:34.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.907 "is_configured": false, 00:25:34.907 "data_offset": 2048, 00:25:34.907 "data_size": 63488 00:25:34.907 }, 00:25:34.907 { 00:25:34.907 "name": "BaseBdev2", 00:25:34.907 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:34.907 "is_configured": true, 00:25:34.907 "data_offset": 2048, 00:25:34.907 "data_size": 63488 00:25:34.907 } 00:25:34.907 ] 00:25:34.907 }' 00:25:34.907 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:34.907 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:35.542 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:35.542 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.542 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:35.542 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:35.542 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.542 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.542 20:01:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.802 "name": "raid_bdev1", 00:25:35.802 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:35.802 "strip_size_kb": 0, 00:25:35.802 "state": "online", 00:25:35.802 "raid_level": "raid1", 00:25:35.802 "superblock": true, 00:25:35.802 "num_base_bdevs": 2, 00:25:35.802 "num_base_bdevs_discovered": 1, 00:25:35.802 "num_base_bdevs_operational": 1, 00:25:35.802 "base_bdevs_list": [ 00:25:35.802 { 00:25:35.802 "name": null, 00:25:35.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.802 "is_configured": false, 00:25:35.802 "data_offset": 2048, 00:25:35.802 "data_size": 63488 00:25:35.802 }, 00:25:35.802 { 00:25:35.802 "name": "BaseBdev2", 00:25:35.802 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:35.802 "is_configured": true, 00:25:35.802 "data_offset": 2048, 00:25:35.802 "data_size": 63488 00:25:35.802 } 00:25:35.802 ] 00:25:35.802 }' 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local 
es=0 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:35.802 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:36.061 [2024-07-24 20:01:27.532796] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:36.061 [2024-07-24 20:01:27.532929] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:36.061 
[2024-07-24 20:01:27.532947] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:36.061 request: 00:25:36.061 { 00:25:36.061 "base_bdev": "BaseBdev1", 00:25:36.061 "raid_bdev": "raid_bdev1", 00:25:36.061 "method": "bdev_raid_add_base_bdev", 00:25:36.061 "req_id": 1 00:25:36.061 } 00:25:36.061 Got JSON-RPC error response 00:25:36.061 response: 00:25:36.061 { 00:25:36.061 "code": -22, 00:25:36.061 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:36.061 } 00:25:36.061 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:25:36.061 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:25:36.061 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:25:36.061 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:25:36.061 20:01:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:37.006 20:01:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.006 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.266 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:37.266 "name": "raid_bdev1", 00:25:37.266 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:37.266 "strip_size_kb": 0, 00:25:37.266 "state": "online", 00:25:37.266 "raid_level": "raid1", 00:25:37.266 "superblock": true, 00:25:37.266 "num_base_bdevs": 2, 00:25:37.266 "num_base_bdevs_discovered": 1, 00:25:37.266 "num_base_bdevs_operational": 1, 00:25:37.266 "base_bdevs_list": [ 00:25:37.266 { 00:25:37.266 "name": null, 00:25:37.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.266 "is_configured": false, 00:25:37.266 "data_offset": 2048, 00:25:37.266 "data_size": 63488 00:25:37.266 }, 00:25:37.266 { 00:25:37.266 "name": "BaseBdev2", 00:25:37.266 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:37.266 "is_configured": true, 00:25:37.266 "data_offset": 2048, 00:25:37.266 "data_size": 63488 00:25:37.266 } 00:25:37.266 ] 00:25:37.266 }' 00:25:37.266 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:37.266 20:01:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:37.835 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:37.835 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:37.835 20:01:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:37.835 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:37.835 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:37.835 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.835 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.094 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.094 "name": "raid_bdev1", 00:25:38.094 "uuid": "e2e73bc4-36c2-4711-a877-4cc625b9dd68", 00:25:38.094 "strip_size_kb": 0, 00:25:38.094 "state": "online", 00:25:38.094 "raid_level": "raid1", 00:25:38.094 "superblock": true, 00:25:38.094 "num_base_bdevs": 2, 00:25:38.094 "num_base_bdevs_discovered": 1, 00:25:38.094 "num_base_bdevs_operational": 1, 00:25:38.094 "base_bdevs_list": [ 00:25:38.094 { 00:25:38.094 "name": null, 00:25:38.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.094 "is_configured": false, 00:25:38.094 "data_offset": 2048, 00:25:38.094 "data_size": 63488 00:25:38.094 }, 00:25:38.094 { 00:25:38.094 "name": "BaseBdev2", 00:25:38.094 "uuid": "e68b202d-37f1-581d-9686-4e4b5e3107ed", 00:25:38.094 "is_configured": true, 00:25:38.094 "data_offset": 2048, 00:25:38.094 "data_size": 63488 00:25:38.094 } 00:25:38.094 ] 00:25:38.094 }' 00:25:38.094 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.353 20:01:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1495904 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1495904 ']' 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1495904 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1495904 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1495904' 00:25:38.353 killing process with pid 1495904 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1495904 00:25:38.353 Received shutdown signal, test time was about 27.119298 seconds 00:25:38.353 00:25:38.353 Latency(us) 00:25:38.353 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:38.353 =================================================================================================================== 00:25:38.353 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:38.353 [2024-07-24 20:01:29.809589] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:38.353 [2024-07-24 20:01:29.809686] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:38.353 [2024-07-24 20:01:29.809739] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:25:38.353 20:01:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1495904 00:25:38.353 [2024-07-24 20:01:29.809751] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16b5ef0 name raid_bdev1, state offline 00:25:38.353 [2024-07-24 20:01:29.830793] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:25:38.613 00:25:38.613 real 0m31.755s 00:25:38.613 user 0m49.598s 00:25:38.613 sys 0m4.646s 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:38.613 ************************************ 00:25:38.613 END TEST raid_rebuild_test_sb_io 00:25:38.613 ************************************ 00:25:38.613 20:01:30 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:25:38.613 20:01:30 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:25:38.613 20:01:30 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:38.613 20:01:30 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:38.613 20:01:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:38.613 ************************************ 00:25:38.613 START TEST raid_rebuild_test 00:25:38.613 ************************************ 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false false true 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:25:38.613 
20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:25:38.613 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 
00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=1500450 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 1500450 /var/tmp/spdk-raid.sock 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 1500450 ']' 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:38.614 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:38.614 20:01:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:38.614 [2024-07-24 20:01:30.194809] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:25:38.614 [2024-07-24 20:01:30.194875] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1500450 ] 00:25:38.614 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:38.614 Zero copy mechanism will not be used. 00:25:38.873 [2024-07-24 20:01:30.325847] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:38.873 [2024-07-24 20:01:30.432141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:39.133 [2024-07-24 20:01:30.503121] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:39.133 [2024-07-24 20:01:30.503162] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:39.702 20:01:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:39.702 20:01:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:25:39.702 20:01:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:39.702 20:01:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:39.961 BaseBdev1_malloc 00:25:39.961 20:01:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:40.220 [2024-07-24 
20:01:31.605381] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:40.220 [2024-07-24 20:01:31.605437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.220 [2024-07-24 20:01:31.605459] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1384cd0 00:25:40.220 [2024-07-24 20:01:31.605472] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.220 [2024-07-24 20:01:31.607003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.220 [2024-07-24 20:01:31.607033] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:40.220 BaseBdev1 00:25:40.220 20:01:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:40.220 20:01:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:40.479 BaseBdev2_malloc 00:25:40.479 20:01:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:40.738 [2024-07-24 20:01:32.107443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:40.738 [2024-07-24 20:01:32.107496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.738 [2024-07-24 20:01:32.107516] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1388460 00:25:40.738 [2024-07-24 20:01:32.107529] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.738 [2024-07-24 20:01:32.109066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.738 [2024-07-24 20:01:32.109096] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:40.738 BaseBdev2 00:25:40.738 20:01:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:40.738 20:01:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:40.997 BaseBdev3_malloc 00:25:40.997 20:01:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:41.256 [2024-07-24 20:01:32.594558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:41.256 [2024-07-24 20:01:32.594605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:41.256 [2024-07-24 20:01:32.594626] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1448780 00:25:41.256 [2024-07-24 20:01:32.594639] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:41.256 [2024-07-24 20:01:32.596224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:41.256 [2024-07-24 20:01:32.596256] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:41.256 BaseBdev3 00:25:41.256 20:01:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:41.256 20:01:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:41.515 BaseBdev4_malloc 00:25:41.516 20:01:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:25:41.516 [2024-07-24 20:01:33.085711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:41.516 [2024-07-24 20:01:33.085759] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:41.516 [2024-07-24 20:01:33.085780] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1447e60 00:25:41.516 [2024-07-24 20:01:33.085792] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:41.516 [2024-07-24 20:01:33.087311] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:41.516 [2024-07-24 20:01:33.087343] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:41.516 BaseBdev4 00:25:41.516 20:01:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:41.778 spare_malloc 00:25:41.778 20:01:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:42.036 spare_delay 00:25:42.036 20:01:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:42.294 [2024-07-24 20:01:33.828250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:42.294 [2024-07-24 20:01:33.828295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:42.294 [2024-07-24 20:01:33.828318] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x137ea50 00:25:42.294 [2024-07-24 20:01:33.828331] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:42.294 
[2024-07-24 20:01:33.829900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:42.294 [2024-07-24 20:01:33.829934] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:42.294 spare 00:25:42.294 20:01:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:42.553 [2024-07-24 20:01:34.068906] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:42.553 [2024-07-24 20:01:34.070202] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:42.553 [2024-07-24 20:01:34.070256] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:42.553 [2024-07-24 20:01:34.070301] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:42.553 [2024-07-24 20:01:34.070386] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1381130 00:25:42.553 [2024-07-24 20:01:34.070405] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:42.553 [2024-07-24 20:01:34.070620] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x137cfc0 00:25:42.553 [2024-07-24 20:01:34.070777] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1381130 00:25:42.553 [2024-07-24 20:01:34.070788] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1381130 00:25:42.553 [2024-07-24 20:01:34.070903] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.553 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.812 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:42.812 "name": "raid_bdev1", 00:25:42.812 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:42.812 "strip_size_kb": 0, 00:25:42.812 "state": "online", 00:25:42.812 "raid_level": "raid1", 00:25:42.812 "superblock": false, 00:25:42.812 "num_base_bdevs": 4, 00:25:42.812 "num_base_bdevs_discovered": 4, 00:25:42.812 "num_base_bdevs_operational": 4, 00:25:42.812 "base_bdevs_list": [ 00:25:42.812 { 00:25:42.812 "name": "BaseBdev1", 00:25:42.812 "uuid": "796f2353-08e6-5df9-98b5-07731275ec6f", 00:25:42.812 "is_configured": true, 00:25:42.812 "data_offset": 0, 00:25:42.812 "data_size": 65536 00:25:42.812 }, 00:25:42.812 { 00:25:42.812 "name": "BaseBdev2", 00:25:42.812 "uuid": "89c36be6-d8dc-5260-9df6-748bb2676c55", 
00:25:42.812 "is_configured": true, 00:25:42.812 "data_offset": 0, 00:25:42.812 "data_size": 65536 00:25:42.812 }, 00:25:42.812 { 00:25:42.812 "name": "BaseBdev3", 00:25:42.812 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:42.812 "is_configured": true, 00:25:42.812 "data_offset": 0, 00:25:42.812 "data_size": 65536 00:25:42.812 }, 00:25:42.812 { 00:25:42.812 "name": "BaseBdev4", 00:25:42.812 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:42.812 "is_configured": true, 00:25:42.812 "data_offset": 0, 00:25:42.812 "data_size": 65536 00:25:42.812 } 00:25:42.812 ] 00:25:42.812 }' 00:25:42.812 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:42.812 20:01:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:43.381 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:43.381 20:01:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:25:43.640 [2024-07-24 20:01:35.164087] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:43.640 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:25:43.640 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.640 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # 
local write_unit_size 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:43.899 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:44.157 [2024-07-24 20:01:35.665158] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x137cfc0 00:25:44.157 /dev/nbd0 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q 
-w nbd0 /proc/partitions 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:44.157 1+0 records in 00:25:44.157 1+0 records out 00:25:44.157 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251442 s, 16.3 MB/s 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:25:44.157 20:01:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:52.289 65536+0 records in 00:25:52.289 65536+0 records out 00:25:52.289 33554432 bytes (34 MB, 32 MiB) copied, 6.88264 s, 4.9 MB/s 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:52.289 20:01:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:52.289 [2024-07-24 20:01:42.900130] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:52.289 [2024-07-24 20:01:43.120442] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:52.289 
20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:52.289 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:52.289 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:52.289 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:52.289 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:52.289 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:52.290 "name": "raid_bdev1", 00:25:52.290 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:52.290 "strip_size_kb": 0, 00:25:52.290 "state": "online", 00:25:52.290 "raid_level": "raid1", 00:25:52.290 "superblock": false, 00:25:52.290 "num_base_bdevs": 4, 00:25:52.290 "num_base_bdevs_discovered": 3, 00:25:52.290 "num_base_bdevs_operational": 3, 00:25:52.290 "base_bdevs_list": [ 00:25:52.290 { 00:25:52.290 "name": null, 00:25:52.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.290 "is_configured": 
false, 00:25:52.290 "data_offset": 0, 00:25:52.290 "data_size": 65536 00:25:52.290 }, 00:25:52.290 { 00:25:52.290 "name": "BaseBdev2", 00:25:52.290 "uuid": "89c36be6-d8dc-5260-9df6-748bb2676c55", 00:25:52.290 "is_configured": true, 00:25:52.290 "data_offset": 0, 00:25:52.290 "data_size": 65536 00:25:52.290 }, 00:25:52.290 { 00:25:52.290 "name": "BaseBdev3", 00:25:52.290 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:52.290 "is_configured": true, 00:25:52.290 "data_offset": 0, 00:25:52.290 "data_size": 65536 00:25:52.290 }, 00:25:52.290 { 00:25:52.290 "name": "BaseBdev4", 00:25:52.290 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:52.290 "is_configured": true, 00:25:52.290 "data_offset": 0, 00:25:52.290 "data_size": 65536 00:25:52.290 } 00:25:52.290 ] 00:25:52.290 }' 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:52.290 20:01:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:52.548 20:01:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:52.807 [2024-07-24 20:01:44.275520] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:52.807 [2024-07-24 20:01:44.279613] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x137cfc0 00:25:52.807 [2024-07-24 20:01:44.281984] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:52.807 20:01:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:53.744 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:53.744 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:53.744 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:53.744 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:53.744 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:53.744 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.744 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.003 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.003 "name": "raid_bdev1", 00:25:54.003 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:54.003 "strip_size_kb": 0, 00:25:54.003 "state": "online", 00:25:54.003 "raid_level": "raid1", 00:25:54.003 "superblock": false, 00:25:54.003 "num_base_bdevs": 4, 00:25:54.003 "num_base_bdevs_discovered": 4, 00:25:54.003 "num_base_bdevs_operational": 4, 00:25:54.003 "process": { 00:25:54.003 "type": "rebuild", 00:25:54.003 "target": "spare", 00:25:54.003 "progress": { 00:25:54.003 "blocks": 24576, 00:25:54.003 "percent": 37 00:25:54.003 } 00:25:54.003 }, 00:25:54.003 "base_bdevs_list": [ 00:25:54.003 { 00:25:54.003 "name": "spare", 00:25:54.003 "uuid": "88875076-faea-50c7-8c60-c6a4df546d9b", 00:25:54.003 "is_configured": true, 00:25:54.003 "data_offset": 0, 00:25:54.003 "data_size": 65536 00:25:54.003 }, 00:25:54.003 { 00:25:54.003 "name": "BaseBdev2", 00:25:54.003 "uuid": "89c36be6-d8dc-5260-9df6-748bb2676c55", 00:25:54.003 "is_configured": true, 00:25:54.003 "data_offset": 0, 00:25:54.003 "data_size": 65536 00:25:54.003 }, 00:25:54.003 { 00:25:54.003 "name": "BaseBdev3", 00:25:54.003 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:54.003 "is_configured": true, 00:25:54.003 "data_offset": 0, 00:25:54.003 "data_size": 65536 00:25:54.003 }, 00:25:54.003 { 00:25:54.003 "name": "BaseBdev4", 00:25:54.003 "uuid": 
"e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:54.003 "is_configured": true, 00:25:54.003 "data_offset": 0, 00:25:54.003 "data_size": 65536 00:25:54.003 } 00:25:54.003 ] 00:25:54.003 }' 00:25:54.003 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.262 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:54.262 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.262 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:54.262 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:54.521 [2024-07-24 20:01:45.868753] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:54.521 [2024-07-24 20:01:45.894688] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:54.521 [2024-07-24 20:01:45.894730] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:54.521 [2024-07-24 20:01:45.894753] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:54.521 [2024-07-24 20:01:45.894762] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.521 20:01:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.780 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:54.780 "name": "raid_bdev1", 00:25:54.780 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:54.780 "strip_size_kb": 0, 00:25:54.780 "state": "online", 00:25:54.780 "raid_level": "raid1", 00:25:54.780 "superblock": false, 00:25:54.780 "num_base_bdevs": 4, 00:25:54.780 "num_base_bdevs_discovered": 3, 00:25:54.780 "num_base_bdevs_operational": 3, 00:25:54.780 "base_bdevs_list": [ 00:25:54.780 { 00:25:54.780 "name": null, 00:25:54.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.780 "is_configured": false, 00:25:54.780 "data_offset": 0, 00:25:54.780 "data_size": 65536 00:25:54.780 }, 00:25:54.780 { 00:25:54.780 "name": "BaseBdev2", 00:25:54.780 "uuid": "89c36be6-d8dc-5260-9df6-748bb2676c55", 00:25:54.780 "is_configured": true, 00:25:54.781 "data_offset": 0, 00:25:54.781 "data_size": 65536 00:25:54.781 }, 00:25:54.781 { 00:25:54.781 "name": "BaseBdev3", 00:25:54.781 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:54.781 "is_configured": true, 00:25:54.781 "data_offset": 0, 00:25:54.781 "data_size": 65536 
00:25:54.781 }, 00:25:54.781 { 00:25:54.781 "name": "BaseBdev4", 00:25:54.781 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:54.781 "is_configured": true, 00:25:54.781 "data_offset": 0, 00:25:54.781 "data_size": 65536 00:25:54.781 } 00:25:54.781 ] 00:25:54.781 }' 00:25:54.781 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:54.781 20:01:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:55.349 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:55.349 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.349 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:55.349 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:55.349 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.349 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.349 20:01:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.608 20:01:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.608 "name": "raid_bdev1", 00:25:55.608 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:55.608 "strip_size_kb": 0, 00:25:55.608 "state": "online", 00:25:55.608 "raid_level": "raid1", 00:25:55.608 "superblock": false, 00:25:55.608 "num_base_bdevs": 4, 00:25:55.608 "num_base_bdevs_discovered": 3, 00:25:55.608 "num_base_bdevs_operational": 3, 00:25:55.608 "base_bdevs_list": [ 00:25:55.608 { 00:25:55.608 "name": null, 00:25:55.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.608 "is_configured": false, 00:25:55.608 "data_offset": 0, 00:25:55.608 
"data_size": 65536 00:25:55.608 }, 00:25:55.608 { 00:25:55.608 "name": "BaseBdev2", 00:25:55.608 "uuid": "89c36be6-d8dc-5260-9df6-748bb2676c55", 00:25:55.608 "is_configured": true, 00:25:55.608 "data_offset": 0, 00:25:55.608 "data_size": 65536 00:25:55.608 }, 00:25:55.608 { 00:25:55.608 "name": "BaseBdev3", 00:25:55.608 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:55.608 "is_configured": true, 00:25:55.608 "data_offset": 0, 00:25:55.608 "data_size": 65536 00:25:55.608 }, 00:25:55.608 { 00:25:55.608 "name": "BaseBdev4", 00:25:55.608 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:55.608 "is_configured": true, 00:25:55.608 "data_offset": 0, 00:25:55.608 "data_size": 65536 00:25:55.608 } 00:25:55.608 ] 00:25:55.608 }' 00:25:55.608 20:01:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:55.608 20:01:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:55.608 20:01:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:55.608 20:01:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:55.608 20:01:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:55.868 [2024-07-24 20:01:47.386720] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:55.868 [2024-07-24 20:01:47.390808] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x137cfc0 00:25:55.868 [2024-07-24 20:01:47.392321] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:55.868 20:01:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:56.867 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:56.867 
20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.867 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:56.867 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:56.867 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.867 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.867 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.126 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:57.126 "name": "raid_bdev1", 00:25:57.126 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:57.126 "strip_size_kb": 0, 00:25:57.126 "state": "online", 00:25:57.126 "raid_level": "raid1", 00:25:57.126 "superblock": false, 00:25:57.126 "num_base_bdevs": 4, 00:25:57.126 "num_base_bdevs_discovered": 4, 00:25:57.126 "num_base_bdevs_operational": 4, 00:25:57.126 "process": { 00:25:57.126 "type": "rebuild", 00:25:57.126 "target": "spare", 00:25:57.126 "progress": { 00:25:57.126 "blocks": 22528, 00:25:57.126 "percent": 34 00:25:57.126 } 00:25:57.126 }, 00:25:57.126 "base_bdevs_list": [ 00:25:57.126 { 00:25:57.126 "name": "spare", 00:25:57.126 "uuid": "88875076-faea-50c7-8c60-c6a4df546d9b", 00:25:57.126 "is_configured": true, 00:25:57.126 "data_offset": 0, 00:25:57.126 "data_size": 65536 00:25:57.126 }, 00:25:57.126 { 00:25:57.126 "name": "BaseBdev2", 00:25:57.126 "uuid": "89c36be6-d8dc-5260-9df6-748bb2676c55", 00:25:57.126 "is_configured": true, 00:25:57.126 "data_offset": 0, 00:25:57.126 "data_size": 65536 00:25:57.126 }, 00:25:57.126 { 00:25:57.126 "name": "BaseBdev3", 00:25:57.126 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:57.126 
"is_configured": true, 00:25:57.126 "data_offset": 0, 00:25:57.126 "data_size": 65536 00:25:57.126 }, 00:25:57.126 { 00:25:57.126 "name": "BaseBdev4", 00:25:57.126 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:57.126 "is_configured": true, 00:25:57.126 "data_offset": 0, 00:25:57.126 "data_size": 65536 00:25:57.126 } 00:25:57.126 ] 00:25:57.126 }' 00:25:57.126 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.126 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:57.126 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.385 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:57.385 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:25:57.385 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:25:57.385 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:25:57.385 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:25:57.385 20:01:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:57.385 [2024-07-24 20:01:48.956560] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:57.645 [2024-07-24 20:01:49.004613] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x137cfc0 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.645 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:57.903 "name": "raid_bdev1", 00:25:57.903 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:57.903 "strip_size_kb": 0, 00:25:57.903 "state": "online", 00:25:57.903 "raid_level": "raid1", 00:25:57.903 "superblock": false, 00:25:57.903 "num_base_bdevs": 4, 00:25:57.903 "num_base_bdevs_discovered": 3, 00:25:57.903 "num_base_bdevs_operational": 3, 00:25:57.903 "process": { 00:25:57.903 "type": "rebuild", 00:25:57.903 "target": "spare", 00:25:57.903 "progress": { 00:25:57.903 "blocks": 36864, 00:25:57.903 "percent": 56 00:25:57.903 } 00:25:57.903 }, 00:25:57.903 "base_bdevs_list": [ 00:25:57.903 { 00:25:57.903 "name": "spare", 00:25:57.903 "uuid": "88875076-faea-50c7-8c60-c6a4df546d9b", 00:25:57.903 "is_configured": true, 00:25:57.903 "data_offset": 0, 00:25:57.903 "data_size": 65536 00:25:57.903 }, 00:25:57.903 { 00:25:57.903 "name": null, 00:25:57.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.903 "is_configured": false, 00:25:57.903 "data_offset": 0, 00:25:57.903 "data_size": 65536 00:25:57.903 }, 00:25:57.903 { 00:25:57.903 "name": "BaseBdev3", 00:25:57.903 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:57.903 
"is_configured": true, 00:25:57.903 "data_offset": 0, 00:25:57.903 "data_size": 65536 00:25:57.903 }, 00:25:57.903 { 00:25:57.903 "name": "BaseBdev4", 00:25:57.903 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:57.903 "is_configured": true, 00:25:57.903 "data_offset": 0, 00:25:57.903 "data_size": 65536 00:25:57.903 } 00:25:57.903 ] 00:25:57.903 }' 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=930 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.903 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.161 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.161 "name": 
"raid_bdev1", 00:25:58.161 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:58.161 "strip_size_kb": 0, 00:25:58.161 "state": "online", 00:25:58.161 "raid_level": "raid1", 00:25:58.161 "superblock": false, 00:25:58.161 "num_base_bdevs": 4, 00:25:58.161 "num_base_bdevs_discovered": 3, 00:25:58.161 "num_base_bdevs_operational": 3, 00:25:58.161 "process": { 00:25:58.161 "type": "rebuild", 00:25:58.161 "target": "spare", 00:25:58.161 "progress": { 00:25:58.161 "blocks": 43008, 00:25:58.161 "percent": 65 00:25:58.161 } 00:25:58.161 }, 00:25:58.161 "base_bdevs_list": [ 00:25:58.161 { 00:25:58.161 "name": "spare", 00:25:58.161 "uuid": "88875076-faea-50c7-8c60-c6a4df546d9b", 00:25:58.161 "is_configured": true, 00:25:58.161 "data_offset": 0, 00:25:58.161 "data_size": 65536 00:25:58.161 }, 00:25:58.161 { 00:25:58.161 "name": null, 00:25:58.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.161 "is_configured": false, 00:25:58.161 "data_offset": 0, 00:25:58.161 "data_size": 65536 00:25:58.161 }, 00:25:58.161 { 00:25:58.161 "name": "BaseBdev3", 00:25:58.161 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:58.161 "is_configured": true, 00:25:58.161 "data_offset": 0, 00:25:58.161 "data_size": 65536 00:25:58.161 }, 00:25:58.161 { 00:25:58.161 "name": "BaseBdev4", 00:25:58.161 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:58.161 "is_configured": true, 00:25:58.161 "data_offset": 0, 00:25:58.161 "data_size": 65536 00:25:58.161 } 00:25:58.161 ] 00:25:58.161 }' 00:25:58.161 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.161 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:58.161 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.161 20:01:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.161 20:01:49 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@726 -- # sleep 1 00:25:59.094 [2024-07-24 20:01:50.616930] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:59.094 [2024-07-24 20:01:50.616992] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:59.094 [2024-07-24 20:01:50.617030] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:59.094 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:59.095 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:59.095 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:59.095 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:59.095 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:59.095 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:59.095 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.095 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.353 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:59.353 "name": "raid_bdev1", 00:25:59.353 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:59.353 "strip_size_kb": 0, 00:25:59.353 "state": "online", 00:25:59.353 "raid_level": "raid1", 00:25:59.353 "superblock": false, 00:25:59.353 "num_base_bdevs": 4, 00:25:59.353 "num_base_bdevs_discovered": 3, 00:25:59.353 "num_base_bdevs_operational": 3, 00:25:59.353 "base_bdevs_list": [ 00:25:59.353 { 00:25:59.353 "name": "spare", 00:25:59.353 "uuid": "88875076-faea-50c7-8c60-c6a4df546d9b", 00:25:59.353 
"is_configured": true, 00:25:59.354 "data_offset": 0, 00:25:59.354 "data_size": 65536 00:25:59.354 }, 00:25:59.354 { 00:25:59.354 "name": null, 00:25:59.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.354 "is_configured": false, 00:25:59.354 "data_offset": 0, 00:25:59.354 "data_size": 65536 00:25:59.354 }, 00:25:59.354 { 00:25:59.354 "name": "BaseBdev3", 00:25:59.354 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:59.354 "is_configured": true, 00:25:59.354 "data_offset": 0, 00:25:59.354 "data_size": 65536 00:25:59.354 }, 00:25:59.354 { 00:25:59.354 "name": "BaseBdev4", 00:25:59.354 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:59.354 "is_configured": true, 00:25:59.354 "data_offset": 0, 00:25:59.354 "data_size": 65536 00:25:59.354 } 00:25:59.354 ] 00:25:59.354 }' 00:25:59.354 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:59.354 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:59.354 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.613 20:01:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:59.871 "name": "raid_bdev1", 00:25:59.871 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:25:59.871 "strip_size_kb": 0, 00:25:59.871 "state": "online", 00:25:59.871 "raid_level": "raid1", 00:25:59.871 "superblock": false, 00:25:59.871 "num_base_bdevs": 4, 00:25:59.871 "num_base_bdevs_discovered": 3, 00:25:59.871 "num_base_bdevs_operational": 3, 00:25:59.871 "base_bdevs_list": [ 00:25:59.871 { 00:25:59.871 "name": "spare", 00:25:59.871 "uuid": "88875076-faea-50c7-8c60-c6a4df546d9b", 00:25:59.871 "is_configured": true, 00:25:59.871 "data_offset": 0, 00:25:59.871 "data_size": 65536 00:25:59.871 }, 00:25:59.871 { 00:25:59.871 "name": null, 00:25:59.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.871 "is_configured": false, 00:25:59.871 "data_offset": 0, 00:25:59.871 "data_size": 65536 00:25:59.871 }, 00:25:59.871 { 00:25:59.871 "name": "BaseBdev3", 00:25:59.871 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:25:59.871 "is_configured": true, 00:25:59.871 "data_offset": 0, 00:25:59.871 "data_size": 65536 00:25:59.871 }, 00:25:59.871 { 00:25:59.871 "name": "BaseBdev4", 00:25:59.871 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:25:59.871 "is_configured": true, 00:25:59.871 "data_offset": 0, 00:25:59.871 "data_size": 65536 00:25:59.871 } 00:25:59.871 ] 00:25:59.871 }' 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:59.871 20:01:51 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.871 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.130 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:00.130 "name": "raid_bdev1", 00:26:00.130 "uuid": "6836dc09-67f2-435c-8afd-a1919890cb07", 00:26:00.130 "strip_size_kb": 0, 00:26:00.130 "state": "online", 00:26:00.130 "raid_level": "raid1", 00:26:00.130 "superblock": false, 00:26:00.130 "num_base_bdevs": 4, 00:26:00.130 "num_base_bdevs_discovered": 3, 00:26:00.130 "num_base_bdevs_operational": 3, 00:26:00.130 "base_bdevs_list": [ 00:26:00.130 { 00:26:00.130 "name": 
"spare", 00:26:00.130 "uuid": "88875076-faea-50c7-8c60-c6a4df546d9b", 00:26:00.130 "is_configured": true, 00:26:00.130 "data_offset": 0, 00:26:00.130 "data_size": 65536 00:26:00.130 }, 00:26:00.130 { 00:26:00.130 "name": null, 00:26:00.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.130 "is_configured": false, 00:26:00.130 "data_offset": 0, 00:26:00.130 "data_size": 65536 00:26:00.130 }, 00:26:00.130 { 00:26:00.130 "name": "BaseBdev3", 00:26:00.130 "uuid": "e6350d20-3442-5d18-b509-02a808fee572", 00:26:00.130 "is_configured": true, 00:26:00.130 "data_offset": 0, 00:26:00.130 "data_size": 65536 00:26:00.130 }, 00:26:00.130 { 00:26:00.130 "name": "BaseBdev4", 00:26:00.130 "uuid": "e76e0d2d-39bb-541c-8887-c229c674b723", 00:26:00.130 "is_configured": true, 00:26:00.130 "data_offset": 0, 00:26:00.130 "data_size": 65536 00:26:00.130 } 00:26:00.130 ] 00:26:00.130 }' 00:26:00.130 20:01:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:00.130 20:01:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:00.701 20:01:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:00.960 [2024-07-24 20:01:52.473640] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:00.960 [2024-07-24 20:01:52.473669] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:00.960 [2024-07-24 20:01:52.473724] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:00.960 [2024-07-24 20:01:52.473795] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:00.960 [2024-07-24 20:01:52.473807] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1381130 name raid_bdev1, state offline 00:26:00.960 20:01:52 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.960 20:01:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:01.219 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:01.478 /dev/nbd0 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:01.478 1+0 records in 00:26:01.478 1+0 records out 00:26:01.478 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242662 s, 16.9 MB/s 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:01.478 20:01:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:01.737 /dev/nbd1 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:01.737 1+0 records in 00:26:01.737 1+0 records out 00:26:01.737 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341495 s, 12.0 MB/s 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:01.737 20:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:01.994 20:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:01.994 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:01.994 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:01.994 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:01.994 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:01.994 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:01.994 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 1500450 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 1500450 ']' 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 1500450 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:02.261 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1500450 00:26:02.520 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:26:02.520 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:02.520 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1500450' 00:26:02.520 killing process with pid 1500450 00:26:02.520 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 1500450 00:26:02.520 Received shutdown signal, test time was about 60.000000 seconds 00:26:02.520 00:26:02.520 Latency(us) 00:26:02.520 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:02.520 =================================================================================================================== 00:26:02.520 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:02.520 [2024-07-24 20:01:53.888995] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:02.520 20:01:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 1500450 00:26:02.520 [2024-07-24 20:01:53.938741] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:26:02.780 00:26:02.780 real 0m24.037s 00:26:02.780 user 0m33.095s 00:26:02.780 sys 0m4.921s 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:02.780 ************************************ 00:26:02.780 END TEST raid_rebuild_test 00:26:02.780 ************************************ 00:26:02.780 20:01:54 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:26:02.780 20:01:54 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:02.780 20:01:54 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:02.780 20:01:54 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:26:02.780 ************************************ 00:26:02.780 START TEST raid_rebuild_test_sb 00:26:02.780 ************************************ 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true false true 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:02.780 20:01:54 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=1503676 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 1503676 /var/tmp/spdk-raid.sock 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@831 -- # '[' -z 1503676 ']' 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:02.780 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:02.780 20:01:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:02.780 [2024-07-24 20:01:54.326664] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:26:02.780 [2024-07-24 20:01:54.326733] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1503676 ] 00:26:02.780 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:02.780 Zero copy mechanism will not be used. 
00:26:03.039 [2024-07-24 20:01:54.458036] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:03.039 [2024-07-24 20:01:54.561154] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:03.039 [2024-07-24 20:01:54.622402] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:03.039 [2024-07-24 20:01:54.622435] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:03.975 20:01:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:03.975 20:01:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:26:03.975 20:01:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:03.975 20:01:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:03.975 BaseBdev1_malloc 00:26:03.975 20:01:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:04.234 [2024-07-24 20:01:55.742754] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:04.234 [2024-07-24 20:01:55.742803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:04.234 [2024-07-24 20:01:55.742827] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a26cd0 00:26:04.234 [2024-07-24 20:01:55.742839] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:04.234 [2024-07-24 20:01:55.744384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:04.234 [2024-07-24 20:01:55.744430] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:04.234 BaseBdev1 
00:26:04.234 20:01:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:04.234 20:01:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:04.494 BaseBdev2_malloc 00:26:04.494 20:01:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:04.752 [2024-07-24 20:01:56.244843] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:04.752 [2024-07-24 20:01:56.244886] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:04.752 [2024-07-24 20:01:56.244905] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a2a460 00:26:04.752 [2024-07-24 20:01:56.244917] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:04.752 [2024-07-24 20:01:56.246302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:04.752 [2024-07-24 20:01:56.246331] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:04.752 BaseBdev2 00:26:04.752 20:01:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:04.752 20:01:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:05.011 BaseBdev3_malloc 00:26:05.011 20:01:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:05.270 [2024-07-24 20:01:56.742699] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:05.270 [2024-07-24 20:01:56.742746] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.270 [2024-07-24 20:01:56.742766] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aea780 00:26:05.270 [2024-07-24 20:01:56.742778] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.270 [2024-07-24 20:01:56.744225] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.270 [2024-07-24 20:01:56.744255] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:05.270 BaseBdev3 00:26:05.270 20:01:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:05.270 20:01:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:05.528 BaseBdev4_malloc 00:26:05.528 20:01:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:05.787 [2024-07-24 20:01:57.240606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:05.787 [2024-07-24 20:01:57.240651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.787 [2024-07-24 20:01:57.240670] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ae9e60 00:26:05.787 [2024-07-24 20:01:57.240683] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.787 [2024-07-24 20:01:57.242049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.787 [2024-07-24 20:01:57.242076] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:26:05.787 BaseBdev4 00:26:05.787 20:01:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:06.044 spare_malloc 00:26:06.044 20:01:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:06.302 spare_delay 00:26:06.302 20:01:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:06.561 [2024-07-24 20:01:57.987062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:06.561 [2024-07-24 20:01:57.987107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.561 [2024-07-24 20:01:57.987130] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a20a50 00:26:06.561 [2024-07-24 20:01:57.987142] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.561 [2024-07-24 20:01:57.988568] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:06.561 [2024-07-24 20:01:57.988598] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:06.561 spare 00:26:06.561 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:06.820 [2024-07-24 20:01:58.235749] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:06.820 [2024-07-24 20:01:58.236908] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:06.820 [2024-07-24 20:01:58.236960] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:06.820 [2024-07-24 20:01:58.237005] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:06.820 [2024-07-24 20:01:58.237194] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a23130 00:26:06.820 [2024-07-24 20:01:58.237211] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:06.820 [2024-07-24 20:01:58.237408] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a1efc0 00:26:06.820 [2024-07-24 20:01:58.237553] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a23130 00:26:06.820 [2024-07-24 20:01:58.237563] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a23130 00:26:06.820 [2024-07-24 20:01:58.237652] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.820 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.820 
20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.821 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.821 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.821 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.080 20:01:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.080 "name": "raid_bdev1", 00:26:07.080 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:07.080 "strip_size_kb": 0, 00:26:07.080 "state": "online", 00:26:07.080 "raid_level": "raid1", 00:26:07.080 "superblock": true, 00:26:07.080 "num_base_bdevs": 4, 00:26:07.080 "num_base_bdevs_discovered": 4, 00:26:07.080 "num_base_bdevs_operational": 4, 00:26:07.080 "base_bdevs_list": [ 00:26:07.080 { 00:26:07.080 "name": "BaseBdev1", 00:26:07.080 "uuid": "5f495d75-6ad3-5a38-b4f0-09420ce69619", 00:26:07.080 "is_configured": true, 00:26:07.080 "data_offset": 2048, 00:26:07.080 "data_size": 63488 00:26:07.080 }, 00:26:07.080 { 00:26:07.080 "name": "BaseBdev2", 00:26:07.080 "uuid": "fbefda0f-b201-536d-80b6-58fd4780192c", 00:26:07.080 "is_configured": true, 00:26:07.080 "data_offset": 2048, 00:26:07.080 "data_size": 63488 00:26:07.080 }, 00:26:07.080 { 00:26:07.080 "name": "BaseBdev3", 00:26:07.080 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:07.080 "is_configured": true, 00:26:07.080 "data_offset": 2048, 00:26:07.080 "data_size": 63488 00:26:07.080 }, 00:26:07.080 { 00:26:07.080 "name": "BaseBdev4", 00:26:07.080 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:07.080 "is_configured": true, 00:26:07.080 "data_offset": 2048, 00:26:07.080 "data_size": 63488 00:26:07.080 } 00:26:07.080 ] 00:26:07.080 }' 00:26:07.080 20:01:58 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.080 20:01:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:07.647 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:07.647 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:26:07.905 [2024-07-24 20:01:59.326897] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:07.905 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:26:07.905 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.905 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:26:08.163 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:08.164 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:08.164 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:08.423 [2024-07-24 20:01:59.815931] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a1efc0 00:26:08.423 /dev/nbd0 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:26:08.423 1+0 records in 00:26:08.423 1+0 records out 00:26:08.423 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252099 s, 16.2 MB/s 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:26:08.423 20:01:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:26:16.554 63488+0 records in 00:26:16.554 63488+0 records out 00:26:16.554 32505856 bytes (33 MB, 31 MiB) copied, 7.22613 s, 4.5 MB/s 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:16.554 [2024-07-24 20:02:07.373149] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:16.554 [2024-07-24 20:02:07.612863] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.554 "name": "raid_bdev1", 00:26:16.554 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:16.554 "strip_size_kb": 0, 00:26:16.554 "state": "online", 00:26:16.554 "raid_level": "raid1", 00:26:16.554 "superblock": true, 00:26:16.554 "num_base_bdevs": 4, 00:26:16.554 "num_base_bdevs_discovered": 3, 00:26:16.554 "num_base_bdevs_operational": 3, 00:26:16.554 "base_bdevs_list": [ 00:26:16.554 { 00:26:16.554 "name": null, 00:26:16.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.554 "is_configured": false, 00:26:16.554 "data_offset": 2048, 00:26:16.554 "data_size": 63488 00:26:16.554 }, 00:26:16.554 { 00:26:16.554 "name": "BaseBdev2", 00:26:16.554 "uuid": "fbefda0f-b201-536d-80b6-58fd4780192c", 00:26:16.554 "is_configured": true, 00:26:16.554 "data_offset": 2048, 00:26:16.554 "data_size": 63488 00:26:16.554 }, 00:26:16.554 { 00:26:16.554 "name": "BaseBdev3", 
00:26:16.554 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:16.554 "is_configured": true, 00:26:16.554 "data_offset": 2048, 00:26:16.554 "data_size": 63488 00:26:16.554 }, 00:26:16.554 { 00:26:16.554 "name": "BaseBdev4", 00:26:16.554 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:16.554 "is_configured": true, 00:26:16.554 "data_offset": 2048, 00:26:16.554 "data_size": 63488 00:26:16.554 } 00:26:16.554 ] 00:26:16.554 }' 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.554 20:02:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:17.495 20:02:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:17.495 [2024-07-24 20:02:08.956443] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:17.495 [2024-07-24 20:02:08.960617] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a1efc0 00:26:17.495 [2024-07-24 20:02:08.962982] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:17.495 20:02:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:18.433 20:02:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:18.433 20:02:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:18.433 20:02:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:18.433 20:02:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:18.433 20:02:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:18.433 20:02:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.433 20:02:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.692 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:18.692 "name": "raid_bdev1", 00:26:18.692 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:18.692 "strip_size_kb": 0, 00:26:18.692 "state": "online", 00:26:18.692 "raid_level": "raid1", 00:26:18.692 "superblock": true, 00:26:18.692 "num_base_bdevs": 4, 00:26:18.692 "num_base_bdevs_discovered": 4, 00:26:18.692 "num_base_bdevs_operational": 4, 00:26:18.692 "process": { 00:26:18.692 "type": "rebuild", 00:26:18.692 "target": "spare", 00:26:18.692 "progress": { 00:26:18.692 "blocks": 24576, 00:26:18.692 "percent": 38 00:26:18.692 } 00:26:18.692 }, 00:26:18.692 "base_bdevs_list": [ 00:26:18.692 { 00:26:18.692 "name": "spare", 00:26:18.692 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de", 00:26:18.692 "is_configured": true, 00:26:18.692 "data_offset": 2048, 00:26:18.692 "data_size": 63488 00:26:18.692 }, 00:26:18.692 { 00:26:18.692 "name": "BaseBdev2", 00:26:18.692 "uuid": "fbefda0f-b201-536d-80b6-58fd4780192c", 00:26:18.692 "is_configured": true, 00:26:18.692 "data_offset": 2048, 00:26:18.692 "data_size": 63488 00:26:18.692 }, 00:26:18.692 { 00:26:18.692 "name": "BaseBdev3", 00:26:18.692 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:18.692 "is_configured": true, 00:26:18.692 "data_offset": 2048, 00:26:18.692 "data_size": 63488 00:26:18.692 }, 00:26:18.692 { 00:26:18.692 "name": "BaseBdev4", 00:26:18.692 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:18.692 "is_configured": true, 00:26:18.692 "data_offset": 2048, 00:26:18.692 "data_size": 63488 00:26:18.692 } 00:26:18.692 ] 00:26:18.692 }' 00:26:18.692 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:26:18.989 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:18.989 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:18.989 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:18.989 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:18.989 [2024-07-24 20:02:10.563100] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:19.272 [2024-07-24 20:02:10.575572] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:19.272 [2024-07-24 20:02:10.575614] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:19.272 [2024-07-24 20:02:10.575631] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:19.272 [2024-07-24 20:02:10.575639] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:19.272 20:02:10 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:19.272 "name": "raid_bdev1", 00:26:19.272 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:19.272 "strip_size_kb": 0, 00:26:19.272 "state": "online", 00:26:19.272 "raid_level": "raid1", 00:26:19.272 "superblock": true, 00:26:19.272 "num_base_bdevs": 4, 00:26:19.272 "num_base_bdevs_discovered": 3, 00:26:19.272 "num_base_bdevs_operational": 3, 00:26:19.272 "base_bdevs_list": [ 00:26:19.272 { 00:26:19.272 "name": null, 00:26:19.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.272 "is_configured": false, 00:26:19.272 "data_offset": 2048, 00:26:19.272 "data_size": 63488 00:26:19.272 }, 00:26:19.272 { 00:26:19.272 "name": "BaseBdev2", 00:26:19.272 "uuid": "fbefda0f-b201-536d-80b6-58fd4780192c", 00:26:19.272 "is_configured": true, 00:26:19.272 "data_offset": 2048, 00:26:19.272 "data_size": 63488 00:26:19.272 }, 00:26:19.272 { 00:26:19.272 "name": "BaseBdev3", 00:26:19.272 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:19.272 "is_configured": true, 00:26:19.272 "data_offset": 2048, 00:26:19.272 "data_size": 63488 00:26:19.272 }, 00:26:19.272 { 00:26:19.272 "name": "BaseBdev4", 00:26:19.272 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:19.272 "is_configured": true, 00:26:19.272 "data_offset": 2048, 00:26:19.272 "data_size": 63488 
00:26:19.272 }
00:26:19.272 ]
00:26:19.272 }'
00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:19.272 20:02:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:20.208 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:20.208 "name": "raid_bdev1",
00:26:20.208 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:20.208 "strip_size_kb": 0,
00:26:20.209 "state": "online",
00:26:20.209 "raid_level": "raid1",
00:26:20.209 "superblock": true,
00:26:20.209 "num_base_bdevs": 4,
00:26:20.209 "num_base_bdevs_discovered": 3,
00:26:20.209 "num_base_bdevs_operational": 3,
00:26:20.209 "base_bdevs_list": [
00:26:20.209 {
00:26:20.209 "name": null,
00:26:20.209 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:20.209 "is_configured": false,
00:26:20.209 "data_offset": 2048,
00:26:20.209 "data_size": 63488
00:26:20.209 },
00:26:20.209 {
00:26:20.209 "name": "BaseBdev2",
00:26:20.209 "uuid": "fbefda0f-b201-536d-80b6-58fd4780192c",
00:26:20.209 "is_configured": true,
00:26:20.209 "data_offset": 2048,
00:26:20.209 "data_size": 63488
00:26:20.209 },
00:26:20.209 {
00:26:20.209 "name": "BaseBdev3",
00:26:20.209 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065",
00:26:20.209 "is_configured": true,
00:26:20.209 "data_offset": 2048,
00:26:20.209 "data_size": 63488
00:26:20.209 },
00:26:20.209 {
00:26:20.209 "name": "BaseBdev4",
00:26:20.209 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce",
00:26:20.209 "is_configured": true,
00:26:20.209 "data_offset": 2048,
00:26:20.209 "data_size": 63488
00:26:20.209 }
00:26:20.209 ]
00:26:20.209 }'
00:26:20.209 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:20.209 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:26:20.209 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:20.209 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:26:20.209 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:26:20.467 [2024-07-24 20:02:11.963375] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:26:20.467 [2024-07-24 20:02:11.967483] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a28b60
00:26:20.467 [2024-07-24 20:02:11.969021] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:26:20.467 20:02:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1
00:26:21.845 20:02:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:21.845 20:02:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:21.845 20:02:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:21.845 20:02:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:21.845 20:02:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:21.845 20:02:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:21.845 "name": "raid_bdev1",
00:26:21.845 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:21.845 "strip_size_kb": 0,
00:26:21.845 "state": "online",
00:26:21.845 "raid_level": "raid1",
00:26:21.845 "superblock": true,
00:26:21.845 "num_base_bdevs": 4,
00:26:21.845 "num_base_bdevs_discovered": 4,
00:26:21.845 "num_base_bdevs_operational": 4,
00:26:21.845 "process": {
00:26:21.845 "type": "rebuild",
00:26:21.845 "target": "spare",
00:26:21.845 "progress": {
00:26:21.845 "blocks": 24576,
00:26:21.845 "percent": 38
00:26:21.845 }
00:26:21.845 },
00:26:21.845 "base_bdevs_list": [
00:26:21.845 {
00:26:21.845 "name": "spare",
00:26:21.845 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de",
00:26:21.845 "is_configured": true,
00:26:21.845 "data_offset": 2048,
00:26:21.845 "data_size": 63488
00:26:21.845 },
00:26:21.845 {
00:26:21.845 "name": "BaseBdev2",
00:26:21.845 "uuid": "fbefda0f-b201-536d-80b6-58fd4780192c",
00:26:21.845 "is_configured": true,
00:26:21.845 "data_offset": 2048,
00:26:21.845 "data_size": 63488
00:26:21.845 },
00:26:21.845 {
00:26:21.845 "name": "BaseBdev3",
00:26:21.845 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065",
00:26:21.845 "is_configured": true,
00:26:21.845 "data_offset": 2048,
00:26:21.845 "data_size": 63488
00:26:21.845 },
00:26:21.845 {
00:26:21.845 "name": "BaseBdev4",
00:26:21.845 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce",
00:26:21.845 "is_configured": true,
00:26:21.845 "data_offset": 2048,
00:26:21.845 "data_size": 63488
00:26:21.845 }
00:26:21.845 ]
00:26:21.845 }'
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']'
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']'
00:26:21.845 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']'
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']'
00:26:21.845 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:26:22.104 [2024-07-24 20:02:13.564410] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:26:22.104 [2024-07-24 20:02:13.681648] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1a28b60
00:26:22.362 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]=
00:26:22.362 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- ))
00:26:22.363 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:22.363 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:22.363 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:22.363 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:22.363 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:22.363 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:22.363 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:22.621 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:22.621 "name": "raid_bdev1",
00:26:22.622 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:22.622 "strip_size_kb": 0,
00:26:22.622 "state": "online",
00:26:22.622 "raid_level": "raid1",
00:26:22.622 "superblock": true,
00:26:22.622 "num_base_bdevs": 4,
00:26:22.622 "num_base_bdevs_discovered": 3,
00:26:22.622 "num_base_bdevs_operational": 3,
00:26:22.622 "process": {
00:26:22.622 "type": "rebuild",
00:26:22.622 "target": "spare",
00:26:22.622 "progress": {
00:26:22.622 "blocks": 36864,
00:26:22.622 "percent": 58
00:26:22.622 }
00:26:22.622 },
00:26:22.622 "base_bdevs_list": [
00:26:22.622 {
00:26:22.622 "name": "spare",
00:26:22.622 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de",
00:26:22.622 "is_configured": true,
00:26:22.622 "data_offset": 2048,
00:26:22.622 "data_size": 63488
00:26:22.622 },
00:26:22.622 {
00:26:22.622 "name": null,
00:26:22.622 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:22.622 "is_configured": false,
00:26:22.622 "data_offset": 2048,
00:26:22.622 "data_size": 63488
00:26:22.622 },
00:26:22.622 {
00:26:22.622 "name": "BaseBdev3",
00:26:22.622 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065",
00:26:22.622 "is_configured": true,
00:26:22.622 "data_offset": 2048,
00:26:22.622 "data_size": 63488
00:26:22.622 },
00:26:22.622 {
00:26:22.622 "name": "BaseBdev4",
00:26:22.622 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce",
00:26:22.622 "is_configured": true,
00:26:22.622 "data_offset": 2048,
00:26:22.622 "data_size": 63488
00:26:22.622 }
00:26:22.622 ]
00:26:22.622 }'
00:26:22.622 20:02:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=955
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout ))
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:22.622 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:22.881 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:22.881 "name": "raid_bdev1",
00:26:22.881 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:22.881 "strip_size_kb": 0,
00:26:22.881 "state": "online",
00:26:22.881 "raid_level": "raid1",
00:26:22.881 "superblock": true,
00:26:22.881 "num_base_bdevs": 4,
00:26:22.881 "num_base_bdevs_discovered": 3,
00:26:22.881 "num_base_bdevs_operational": 3,
00:26:22.881 "process": {
00:26:22.881 "type": "rebuild",
00:26:22.881 "target": "spare",
00:26:22.881 "progress": {
00:26:22.881 "blocks": 43008,
00:26:22.881 "percent": 67
00:26:22.881 }
00:26:22.881 },
00:26:22.881 "base_bdevs_list": [
00:26:22.881 {
00:26:22.881 "name": "spare",
00:26:22.881 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de",
00:26:22.881 "is_configured": true,
00:26:22.881 "data_offset": 2048,
00:26:22.881 "data_size": 63488
00:26:22.881 },
00:26:22.881 {
00:26:22.881 "name": null,
00:26:22.881 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:22.881 "is_configured": false,
00:26:22.881 "data_offset": 2048,
00:26:22.881 "data_size": 63488
00:26:22.881 },
00:26:22.881 {
00:26:22.881 "name": "BaseBdev3",
00:26:22.881 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065",
00:26:22.881 "is_configured": true,
00:26:22.881 "data_offset": 2048,
00:26:22.881 "data_size": 63488
00:26:22.881 },
00:26:22.881 {
00:26:22.881 "name": "BaseBdev4",
00:26:22.881 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce",
00:26:22.881 "is_configured": true,
00:26:22.881 "data_offset": 2048,
00:26:22.881 "data_size": 63488
00:26:22.881 }
00:26:22.881 ]
00:26:22.881 }'
00:26:22.881 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:22.881 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:22.881 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:22.881 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:22.881 20:02:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1
00:26:23.817 [2024-07-24 20:02:15.193315] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:26:23.817 [2024-07-24 20:02:15.193375] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:26:23.817 [2024-07-24 20:02:15.193478] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout ))
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:23.817 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:24.077 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:24.077 "name": "raid_bdev1",
00:26:24.077 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:24.077 "strip_size_kb": 0,
00:26:24.077 "state": "online",
00:26:24.077 "raid_level": "raid1",
00:26:24.077 "superblock": true,
00:26:24.077 "num_base_bdevs": 4,
00:26:24.077 "num_base_bdevs_discovered": 3,
00:26:24.077 "num_base_bdevs_operational": 3,
00:26:24.077 "base_bdevs_list": [
00:26:24.077 {
00:26:24.077 "name": "spare",
00:26:24.077 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de",
00:26:24.077 "is_configured": true,
00:26:24.077 "data_offset": 2048,
00:26:24.077 "data_size": 63488
00:26:24.077 },
00:26:24.077 {
00:26:24.077 "name": null,
00:26:24.077 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:24.077 "is_configured": false,
00:26:24.077 "data_offset": 2048,
00:26:24.077 "data_size": 63488
00:26:24.077 },
00:26:24.077 {
00:26:24.077 "name": "BaseBdev3",
00:26:24.077 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065",
00:26:24.077 "is_configured": true,
00:26:24.077 "data_offset": 2048,
00:26:24.077 "data_size": 63488
00:26:24.077 },
00:26:24.077 {
00:26:24.077 "name": "BaseBdev4",
00:26:24.077 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce",
00:26:24.077 "is_configured": true,
00:26:24.077 "data_offset": 2048,
00:26:24.077 "data_size": 63488
00:26:24.077 }
00:26:24.077 ]
00:26:24.077 }'
00:26:24.077 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:24.336 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:24.596 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:24.596 "name": "raid_bdev1",
00:26:24.596 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:24.596 "strip_size_kb": 0,
00:26:24.596 "state": "online",
00:26:24.596 "raid_level": "raid1",
00:26:24.596 "superblock": true,
00:26:24.596 "num_base_bdevs": 4,
00:26:24.596 "num_base_bdevs_discovered": 3,
00:26:24.596 "num_base_bdevs_operational": 3,
00:26:24.596 "base_bdevs_list": [
00:26:24.596 {
00:26:24.596 "name": "spare",
00:26:24.596 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de",
00:26:24.596 "is_configured": true,
00:26:24.596 "data_offset": 2048,
00:26:24.596 "data_size": 63488
00:26:24.596 },
00:26:24.596 {
00:26:24.596 "name": null,
00:26:24.596 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:24.596 "is_configured": false,
00:26:24.596 "data_offset": 2048,
00:26:24.596 "data_size": 63488
00:26:24.596 },
00:26:24.596 {
00:26:24.596 "name": "BaseBdev3",
00:26:24.596 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065",
00:26:24.596 "is_configured": true,
00:26:24.596 "data_offset": 2048,
00:26:24.596 "data_size": 63488
00:26:24.596 },
00:26:24.596 {
00:26:24.596 "name": "BaseBdev4",
00:26:24.597 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce",
00:26:24.597 "is_configured": true,
00:26:24.597 "data_offset": 2048,
00:26:24.597 "data_size": 63488
00:26:24.597 }
00:26:24.597 ]
00:26:24.597 }'
00:26:24.597 20:02:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:24.597 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:24.856 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:24.856 "name": "raid_bdev1",
00:26:24.856 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:24.856 "strip_size_kb": 0,
00:26:24.856 "state": "online",
00:26:24.856 "raid_level": "raid1",
00:26:24.856 "superblock": true,
00:26:24.856 "num_base_bdevs": 4,
00:26:24.856 "num_base_bdevs_discovered": 3,
00:26:24.856 "num_base_bdevs_operational": 3,
00:26:24.856 "base_bdevs_list": [
00:26:24.856 {
00:26:24.856 "name": "spare",
00:26:24.856 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de",
00:26:24.856 "is_configured": true,
00:26:24.856 "data_offset": 2048,
00:26:24.856 "data_size": 63488
00:26:24.856 },
00:26:24.856 {
00:26:24.856 "name": null,
00:26:24.856 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:24.856 "is_configured": false,
00:26:24.856 "data_offset": 2048,
00:26:24.856 "data_size": 63488
00:26:24.856 },
00:26:24.856 {
00:26:24.856 "name": "BaseBdev3",
00:26:24.856 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065",
00:26:24.856 "is_configured": true,
00:26:24.856 "data_offset": 2048,
00:26:24.856 "data_size": 63488
00:26:24.856 },
00:26:24.856 {
00:26:24.856 "name": "BaseBdev4",
00:26:24.856 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce",
00:26:24.856 "is_configured": true,
00:26:24.856 "data_offset": 2048,
00:26:24.856 "data_size": 63488
00:26:24.856 }
00:26:24.856 ]
00:26:24.856 }'
00:26:24.856 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:24.856 20:02:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:25.424 20:02:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:26:25.683 [2024-07-24 20:02:17.097979] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:26:25.683 [2024-07-24 20:02:17.098009] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:26:25.683 [2024-07-24 20:02:17.098069] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:26:25.683 [2024-07-24 20:02:17.098139] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:26:25.683 [2024-07-24 20:02:17.098152] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a23130 name raid_bdev1, state offline
00:26:25.683 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:25.683 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]]
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']'
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']'
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1'
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare')
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:26:25.942 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0
00:26:26.201 /dev/nbd0
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:26:26.201 1+0 records in
00:26:26.201 1+0 records out
00:26:26.201 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252926 s, 16.2 MB/s
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:26:26.201 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1
00:26:26.460 /dev/nbd1
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:26:26.460 1+0 records in
00:26:26.460 1+0 records out
00:26:26.460 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316661 s, 12.9 MB/s
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1'
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:26:26.460 20:02:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:26:26.720 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']'
00:26:26.980 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare
00:26:27.239 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:26:27.498 [2024-07-24 20:02:18.947257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:26:27.498 [2024-07-24 20:02:18.947305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:26:27.498 [2024-07-24 20:02:18.947330] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aeacd0
00:26:27.498 [2024-07-24 20:02:18.947342] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:26:27.498 [2024-07-24 20:02:18.949003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:26:27.498 [2024-07-24 20:02:18.949035] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:26:27.498 [2024-07-24 20:02:18.949118] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare
00:26:27.498 [2024-07-24 20:02:18.949147] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:26:27.498 [2024-07-24 20:02:18.949251] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:26:27.498 [2024-07-24 20:02:18.949324] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:26:27.498 spare
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:27.498 20:02:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:27.499 [2024-07-24 20:02:19.049639] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a228e0
00:26:27.499 [2024-07-24 20:02:19.049656] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:26:27.499 [2024-07-24 20:02:19.049858] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae86e0
00:26:27.499 [2024-07-24 20:02:19.050007] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a228e0
00:26:27.499 [2024-07-24 20:02:19.050018] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a228e0
00:26:27.499 [2024-07-24 20:02:19.050124] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:27.758 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:27.758 "name": "raid_bdev1",
00:26:27.758 "uuid": "19818672-98af-4774-827c-62d8bf6d397e",
00:26:27.758 "strip_size_kb": 0,
00:26:27.758 "state": "online",
00:26:27.758 "raid_level": "raid1",
00:26:27.758 "superblock": true, 00:26:27.758 "num_base_bdevs": 4, 00:26:27.758 "num_base_bdevs_discovered": 3, 00:26:27.758 "num_base_bdevs_operational": 3, 00:26:27.758 "base_bdevs_list": [ 00:26:27.758 { 00:26:27.758 "name": "spare", 00:26:27.758 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de", 00:26:27.758 "is_configured": true, 00:26:27.758 "data_offset": 2048, 00:26:27.758 "data_size": 63488 00:26:27.758 }, 00:26:27.758 { 00:26:27.758 "name": null, 00:26:27.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.758 "is_configured": false, 00:26:27.758 "data_offset": 2048, 00:26:27.758 "data_size": 63488 00:26:27.758 }, 00:26:27.758 { 00:26:27.758 "name": "BaseBdev3", 00:26:27.758 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:27.758 "is_configured": true, 00:26:27.758 "data_offset": 2048, 00:26:27.758 "data_size": 63488 00:26:27.758 }, 00:26:27.758 { 00:26:27.758 "name": "BaseBdev4", 00:26:27.758 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:27.758 "is_configured": true, 00:26:27.758 "data_offset": 2048, 00:26:27.758 "data_size": 63488 00:26:27.758 } 00:26:27.758 ] 00:26:27.758 }' 00:26:27.758 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:27.758 20:02:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:28.326 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:28.326 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.326 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:28.326 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:28.326 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.326 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.326 20:02:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.585 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.585 "name": "raid_bdev1", 00:26:28.585 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:28.585 "strip_size_kb": 0, 00:26:28.585 "state": "online", 00:26:28.585 "raid_level": "raid1", 00:26:28.585 "superblock": true, 00:26:28.585 "num_base_bdevs": 4, 00:26:28.585 "num_base_bdevs_discovered": 3, 00:26:28.585 "num_base_bdevs_operational": 3, 00:26:28.585 "base_bdevs_list": [ 00:26:28.585 { 00:26:28.585 "name": "spare", 00:26:28.585 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de", 00:26:28.585 "is_configured": true, 00:26:28.585 "data_offset": 2048, 00:26:28.585 "data_size": 63488 00:26:28.585 }, 00:26:28.585 { 00:26:28.585 "name": null, 00:26:28.585 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:28.585 "is_configured": false, 00:26:28.585 "data_offset": 2048, 00:26:28.585 "data_size": 63488 00:26:28.585 }, 00:26:28.585 { 00:26:28.585 "name": "BaseBdev3", 00:26:28.585 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:28.585 "is_configured": true, 00:26:28.585 "data_offset": 2048, 00:26:28.585 "data_size": 63488 00:26:28.585 }, 00:26:28.585 { 00:26:28.585 "name": "BaseBdev4", 00:26:28.585 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:28.585 "is_configured": true, 00:26:28.585 "data_offset": 2048, 00:26:28.585 "data_size": 63488 00:26:28.585 } 00:26:28.585 ] 00:26:28.585 }' 00:26:28.585 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.585 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:28.585 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:28.844 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:28.844 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.844 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:29.103 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:26:29.103 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:29.103 [2024-07-24 20:02:20.675979] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:29.103 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:29.103 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:29.103 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:29.103 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:29.103 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.362 20:02:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.362 "name": "raid_bdev1", 00:26:29.362 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:29.362 "strip_size_kb": 0, 00:26:29.362 "state": "online", 00:26:29.362 "raid_level": "raid1", 00:26:29.362 "superblock": true, 00:26:29.362 "num_base_bdevs": 4, 00:26:29.362 "num_base_bdevs_discovered": 2, 00:26:29.362 "num_base_bdevs_operational": 2, 00:26:29.362 "base_bdevs_list": [ 00:26:29.362 { 00:26:29.362 "name": null, 00:26:29.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.362 "is_configured": false, 00:26:29.362 "data_offset": 2048, 00:26:29.362 "data_size": 63488 00:26:29.362 }, 00:26:29.362 { 00:26:29.362 "name": null, 00:26:29.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.362 "is_configured": false, 00:26:29.362 "data_offset": 2048, 00:26:29.362 "data_size": 63488 00:26:29.362 }, 00:26:29.362 { 00:26:29.362 "name": "BaseBdev3", 00:26:29.362 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:29.362 "is_configured": true, 00:26:29.362 "data_offset": 2048, 00:26:29.362 "data_size": 63488 00:26:29.362 }, 00:26:29.362 { 00:26:29.362 "name": "BaseBdev4", 00:26:29.362 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:29.362 "is_configured": true, 00:26:29.362 "data_offset": 2048, 00:26:29.362 "data_size": 63488 00:26:29.362 } 00:26:29.362 ] 00:26:29.362 }' 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.362 20:02:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:30.300 20:02:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:30.300 [2024-07-24 20:02:21.762855] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:30.300 [2024-07-24 20:02:21.763011] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:30.300 [2024-07-24 20:02:21.763033] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:30.300 [2024-07-24 20:02:21.763062] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:30.300 [2024-07-24 20:02:21.767029] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae86e0 00:26:30.300 [2024-07-24 20:02:21.768378] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:30.300 20:02:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:26:31.238 20:02:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:31.238 20:02:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:31.238 20:02:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:31.238 20:02:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:31.238 20:02:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:31.238 20:02:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.238 20:02:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.497 20:02:23 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:31.497 "name": "raid_bdev1", 00:26:31.497 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:31.497 "strip_size_kb": 0, 00:26:31.497 "state": "online", 00:26:31.497 "raid_level": "raid1", 00:26:31.497 "superblock": true, 00:26:31.497 "num_base_bdevs": 4, 00:26:31.497 "num_base_bdevs_discovered": 3, 00:26:31.497 "num_base_bdevs_operational": 3, 00:26:31.497 "process": { 00:26:31.497 "type": "rebuild", 00:26:31.497 "target": "spare", 00:26:31.497 "progress": { 00:26:31.497 "blocks": 24576, 00:26:31.497 "percent": 38 00:26:31.497 } 00:26:31.497 }, 00:26:31.497 "base_bdevs_list": [ 00:26:31.497 { 00:26:31.497 "name": "spare", 00:26:31.497 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de", 00:26:31.497 "is_configured": true, 00:26:31.497 "data_offset": 2048, 00:26:31.497 "data_size": 63488 00:26:31.497 }, 00:26:31.497 { 00:26:31.497 "name": null, 00:26:31.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.497 "is_configured": false, 00:26:31.497 "data_offset": 2048, 00:26:31.497 "data_size": 63488 00:26:31.497 }, 00:26:31.497 { 00:26:31.497 "name": "BaseBdev3", 00:26:31.497 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:31.497 "is_configured": true, 00:26:31.497 "data_offset": 2048, 00:26:31.497 "data_size": 63488 00:26:31.497 }, 00:26:31.497 { 00:26:31.497 "name": "BaseBdev4", 00:26:31.497 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:31.497 "is_configured": true, 00:26:31.497 "data_offset": 2048, 00:26:31.497 "data_size": 63488 00:26:31.497 } 00:26:31.497 ] 00:26:31.497 }' 00:26:31.497 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:31.756 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:31.756 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:31.756 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:26:31.756 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:32.015 [2024-07-24 20:02:23.356144] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:32.015 [2024-07-24 20:02:23.380644] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:32.015 [2024-07-24 20:02:23.380691] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:32.015 [2024-07-24 20:02:23.380708] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:32.015 [2024-07-24 20:02:23.380716] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:32.015 20:02:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.015 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.273 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:32.273 "name": "raid_bdev1", 00:26:32.273 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:32.273 "strip_size_kb": 0, 00:26:32.273 "state": "online", 00:26:32.273 "raid_level": "raid1", 00:26:32.273 "superblock": true, 00:26:32.273 "num_base_bdevs": 4, 00:26:32.273 "num_base_bdevs_discovered": 2, 00:26:32.273 "num_base_bdevs_operational": 2, 00:26:32.273 "base_bdevs_list": [ 00:26:32.273 { 00:26:32.273 "name": null, 00:26:32.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.273 "is_configured": false, 00:26:32.273 "data_offset": 2048, 00:26:32.273 "data_size": 63488 00:26:32.273 }, 00:26:32.273 { 00:26:32.273 "name": null, 00:26:32.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.273 "is_configured": false, 00:26:32.273 "data_offset": 2048, 00:26:32.273 "data_size": 63488 00:26:32.273 }, 00:26:32.273 { 00:26:32.273 "name": "BaseBdev3", 00:26:32.273 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:32.273 "is_configured": true, 00:26:32.273 "data_offset": 2048, 00:26:32.273 "data_size": 63488 00:26:32.273 }, 00:26:32.273 { 00:26:32.273 "name": "BaseBdev4", 00:26:32.273 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:32.273 "is_configured": true, 00:26:32.273 "data_offset": 2048, 00:26:32.273 "data_size": 63488 00:26:32.273 } 00:26:32.273 ] 00:26:32.273 }' 00:26:32.273 20:02:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:32.273 20:02:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:32.841 20:02:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:33.100 [2024-07-24 20:02:24.484235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:33.100 [2024-07-24 20:02:24.484288] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.100 [2024-07-24 20:02:24.484310] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a29130 00:26:33.100 [2024-07-24 20:02:24.484323] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.100 [2024-07-24 20:02:24.484723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.100 [2024-07-24 20:02:24.484743] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:33.100 [2024-07-24 20:02:24.484830] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:33.100 [2024-07-24 20:02:24.484843] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:33.100 [2024-07-24 20:02:24.484854] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:33.100 [2024-07-24 20:02:24.484873] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:33.100 [2024-07-24 20:02:24.488899] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae86e0 00:26:33.100 spare 00:26:33.100 [2024-07-24 20:02:24.490260] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:33.100 20:02:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:26:34.039 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:34.039 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:34.039 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:34.039 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:34.039 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:34.039 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.039 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.298 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:34.298 "name": "raid_bdev1", 00:26:34.298 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:34.298 "strip_size_kb": 0, 00:26:34.298 "state": "online", 00:26:34.298 "raid_level": "raid1", 00:26:34.298 "superblock": true, 00:26:34.298 "num_base_bdevs": 4, 00:26:34.298 "num_base_bdevs_discovered": 3, 00:26:34.298 "num_base_bdevs_operational": 3, 00:26:34.298 "process": { 00:26:34.298 "type": "rebuild", 00:26:34.298 "target": "spare", 00:26:34.298 "progress": { 00:26:34.298 "blocks": 24576, 00:26:34.298 
"percent": 38 00:26:34.298 } 00:26:34.298 }, 00:26:34.298 "base_bdevs_list": [ 00:26:34.298 { 00:26:34.298 "name": "spare", 00:26:34.298 "uuid": "ecaacbe1-264d-5fa5-8798-389c2750c2de", 00:26:34.298 "is_configured": true, 00:26:34.298 "data_offset": 2048, 00:26:34.298 "data_size": 63488 00:26:34.298 }, 00:26:34.298 { 00:26:34.298 "name": null, 00:26:34.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.298 "is_configured": false, 00:26:34.298 "data_offset": 2048, 00:26:34.298 "data_size": 63488 00:26:34.298 }, 00:26:34.298 { 00:26:34.298 "name": "BaseBdev3", 00:26:34.298 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:34.298 "is_configured": true, 00:26:34.298 "data_offset": 2048, 00:26:34.298 "data_size": 63488 00:26:34.298 }, 00:26:34.298 { 00:26:34.298 "name": "BaseBdev4", 00:26:34.298 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:34.298 "is_configured": true, 00:26:34.298 "data_offset": 2048, 00:26:34.298 "data_size": 63488 00:26:34.298 } 00:26:34.298 ] 00:26:34.298 }' 00:26:34.298 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:34.298 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:34.298 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:34.298 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:34.298 20:02:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:34.557 [2024-07-24 20:02:26.068547] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:34.557 [2024-07-24 20:02:26.102925] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:34.557 [2024-07-24 20:02:26.102969] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:34.557 [2024-07-24 20:02:26.102985] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:34.557 [2024-07-24 20:02:26.102993] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.557 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.816 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.816 "name": "raid_bdev1", 00:26:34.816 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:34.816 "strip_size_kb": 0, 00:26:34.816 "state": 
"online", 00:26:34.816 "raid_level": "raid1", 00:26:34.816 "superblock": true, 00:26:34.816 "num_base_bdevs": 4, 00:26:34.816 "num_base_bdevs_discovered": 2, 00:26:34.816 "num_base_bdevs_operational": 2, 00:26:34.816 "base_bdevs_list": [ 00:26:34.816 { 00:26:34.816 "name": null, 00:26:34.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.816 "is_configured": false, 00:26:34.816 "data_offset": 2048, 00:26:34.816 "data_size": 63488 00:26:34.816 }, 00:26:34.816 { 00:26:34.816 "name": null, 00:26:34.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.816 "is_configured": false, 00:26:34.816 "data_offset": 2048, 00:26:34.816 "data_size": 63488 00:26:34.816 }, 00:26:34.816 { 00:26:34.816 "name": "BaseBdev3", 00:26:34.816 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:34.816 "is_configured": true, 00:26:34.816 "data_offset": 2048, 00:26:34.816 "data_size": 63488 00:26:34.816 }, 00:26:34.816 { 00:26:34.816 "name": "BaseBdev4", 00:26:34.816 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:34.816 "is_configured": true, 00:26:34.816 "data_offset": 2048, 00:26:34.816 "data_size": 63488 00:26:34.816 } 00:26:34.816 ] 00:26:34.816 }' 00:26:34.816 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.816 20:02:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:35.382 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:35.382 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:35.382 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:35.382 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:35.382 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:35.641 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.641 20:02:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.642 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:35.642 "name": "raid_bdev1", 00:26:35.642 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:35.642 "strip_size_kb": 0, 00:26:35.642 "state": "online", 00:26:35.642 "raid_level": "raid1", 00:26:35.642 "superblock": true, 00:26:35.642 "num_base_bdevs": 4, 00:26:35.642 "num_base_bdevs_discovered": 2, 00:26:35.642 "num_base_bdevs_operational": 2, 00:26:35.642 "base_bdevs_list": [ 00:26:35.642 { 00:26:35.642 "name": null, 00:26:35.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:35.642 "is_configured": false, 00:26:35.642 "data_offset": 2048, 00:26:35.642 "data_size": 63488 00:26:35.642 }, 00:26:35.642 { 00:26:35.642 "name": null, 00:26:35.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:35.642 "is_configured": false, 00:26:35.642 "data_offset": 2048, 00:26:35.642 "data_size": 63488 00:26:35.642 }, 00:26:35.642 { 00:26:35.642 "name": "BaseBdev3", 00:26:35.642 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:35.642 "is_configured": true, 00:26:35.642 "data_offset": 2048, 00:26:35.642 "data_size": 63488 00:26:35.642 }, 00:26:35.642 { 00:26:35.642 "name": "BaseBdev4", 00:26:35.642 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:35.642 "is_configured": true, 00:26:35.642 "data_offset": 2048, 00:26:35.642 "data_size": 63488 00:26:35.642 } 00:26:35.642 ] 00:26:35.642 }' 00:26:35.642 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:35.903 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:35.903 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:35.903 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:35.903 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:36.190 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:36.449 [2024-07-24 20:02:27.795416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:36.449 [2024-07-24 20:02:27.795470] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:36.449 [2024-07-24 20:02:27.795490] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a26f00 00:26:36.449 [2024-07-24 20:02:27.795503] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:36.449 [2024-07-24 20:02:27.795866] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:36.449 [2024-07-24 20:02:27.795887] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:36.449 [2024-07-24 20:02:27.795961] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:36.449 [2024-07-24 20:02:27.795974] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:36.449 [2024-07-24 20:02:27.795985] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:36.449 BaseBdev1 00:26:36.449 20:02:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:37.386 
20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.386 20:02:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.645 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.645 "name": "raid_bdev1", 00:26:37.645 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:37.645 "strip_size_kb": 0, 00:26:37.645 "state": "online", 00:26:37.645 "raid_level": "raid1", 00:26:37.645 "superblock": true, 00:26:37.645 "num_base_bdevs": 4, 00:26:37.645 "num_base_bdevs_discovered": 2, 00:26:37.645 "num_base_bdevs_operational": 2, 00:26:37.645 "base_bdevs_list": [ 00:26:37.645 { 00:26:37.645 "name": null, 00:26:37.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.645 "is_configured": false, 00:26:37.645 "data_offset": 2048, 00:26:37.645 "data_size": 63488 00:26:37.645 }, 
00:26:37.645 { 00:26:37.645 "name": null, 00:26:37.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.645 "is_configured": false, 00:26:37.645 "data_offset": 2048, 00:26:37.645 "data_size": 63488 00:26:37.645 }, 00:26:37.645 { 00:26:37.645 "name": "BaseBdev3", 00:26:37.645 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:37.645 "is_configured": true, 00:26:37.645 "data_offset": 2048, 00:26:37.645 "data_size": 63488 00:26:37.645 }, 00:26:37.645 { 00:26:37.645 "name": "BaseBdev4", 00:26:37.645 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:37.645 "is_configured": true, 00:26:37.645 "data_offset": 2048, 00:26:37.645 "data_size": 63488 00:26:37.645 } 00:26:37.645 ] 00:26:37.645 }' 00:26:37.645 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.645 20:02:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:38.213 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:38.213 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:38.213 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:38.213 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:38.213 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:38.213 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.213 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.472 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.472 "name": "raid_bdev1", 00:26:38.472 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:38.472 
"strip_size_kb": 0, 00:26:38.472 "state": "online", 00:26:38.472 "raid_level": "raid1", 00:26:38.472 "superblock": true, 00:26:38.472 "num_base_bdevs": 4, 00:26:38.472 "num_base_bdevs_discovered": 2, 00:26:38.472 "num_base_bdevs_operational": 2, 00:26:38.472 "base_bdevs_list": [ 00:26:38.472 { 00:26:38.472 "name": null, 00:26:38.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.472 "is_configured": false, 00:26:38.472 "data_offset": 2048, 00:26:38.472 "data_size": 63488 00:26:38.472 }, 00:26:38.472 { 00:26:38.472 "name": null, 00:26:38.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.472 "is_configured": false, 00:26:38.472 "data_offset": 2048, 00:26:38.472 "data_size": 63488 00:26:38.472 }, 00:26:38.472 { 00:26:38.472 "name": "BaseBdev3", 00:26:38.472 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:38.472 "is_configured": true, 00:26:38.472 "data_offset": 2048, 00:26:38.472 "data_size": 63488 00:26:38.472 }, 00:26:38.472 { 00:26:38.472 "name": "BaseBdev4", 00:26:38.472 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:38.472 "is_configured": true, 00:26:38.472 "data_offset": 2048, 00:26:38.472 "data_size": 63488 00:26:38.472 } 00:26:38.472 ] 00:26:38.472 }' 00:26:38.472 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:38.472 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:38.472 20:02:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:26:38.472 20:02:30 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:38.472 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:38.731 [2024-07-24 20:02:30.193797] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:38.731 [2024-07-24 20:02:30.193942] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:38.731 [2024-07-24 20:02:30.193959] 
bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:38.731 request: 00:26:38.731 { 00:26:38.731 "base_bdev": "BaseBdev1", 00:26:38.731 "raid_bdev": "raid_bdev1", 00:26:38.731 "method": "bdev_raid_add_base_bdev", 00:26:38.731 "req_id": 1 00:26:38.731 } 00:26:38.731 Got JSON-RPC error response 00:26:38.731 response: 00:26:38.731 { 00:26:38.731 "code": -22, 00:26:38.731 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:38.731 } 00:26:38.731 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:26:38.731 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:26:38.731 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:26:38.731 20:02:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:26:38.731 20:02:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:26:39.668 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:39.668 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:39.668 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:39.668 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:39.668 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:39.668 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:39.669 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:39.669 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:39.669 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:26:39.669 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:39.669 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.669 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:39.928 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:39.928 "name": "raid_bdev1", 00:26:39.928 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:39.928 "strip_size_kb": 0, 00:26:39.928 "state": "online", 00:26:39.928 "raid_level": "raid1", 00:26:39.928 "superblock": true, 00:26:39.928 "num_base_bdevs": 4, 00:26:39.928 "num_base_bdevs_discovered": 2, 00:26:39.928 "num_base_bdevs_operational": 2, 00:26:39.928 "base_bdevs_list": [ 00:26:39.928 { 00:26:39.928 "name": null, 00:26:39.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.928 "is_configured": false, 00:26:39.928 "data_offset": 2048, 00:26:39.928 "data_size": 63488 00:26:39.928 }, 00:26:39.928 { 00:26:39.928 "name": null, 00:26:39.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.928 "is_configured": false, 00:26:39.928 "data_offset": 2048, 00:26:39.928 "data_size": 63488 00:26:39.928 }, 00:26:39.928 { 00:26:39.928 "name": "BaseBdev3", 00:26:39.928 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 00:26:39.928 "is_configured": true, 00:26:39.928 "data_offset": 2048, 00:26:39.928 "data_size": 63488 00:26:39.928 }, 00:26:39.928 { 00:26:39.928 "name": "BaseBdev4", 00:26:39.928 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:39.928 "is_configured": true, 00:26:39.928 "data_offset": 2048, 00:26:39.928 "data_size": 63488 00:26:39.928 } 00:26:39.928 ] 00:26:39.928 }' 00:26:39.928 20:02:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:39.928 20:02:31 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.866 "name": "raid_bdev1", 00:26:40.866 "uuid": "19818672-98af-4774-827c-62d8bf6d397e", 00:26:40.866 "strip_size_kb": 0, 00:26:40.866 "state": "online", 00:26:40.866 "raid_level": "raid1", 00:26:40.866 "superblock": true, 00:26:40.866 "num_base_bdevs": 4, 00:26:40.866 "num_base_bdevs_discovered": 2, 00:26:40.866 "num_base_bdevs_operational": 2, 00:26:40.866 "base_bdevs_list": [ 00:26:40.866 { 00:26:40.866 "name": null, 00:26:40.866 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.866 "is_configured": false, 00:26:40.866 "data_offset": 2048, 00:26:40.866 "data_size": 63488 00:26:40.866 }, 00:26:40.866 { 00:26:40.866 "name": null, 00:26:40.866 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.866 "is_configured": false, 00:26:40.866 "data_offset": 2048, 00:26:40.866 "data_size": 63488 00:26:40.866 }, 00:26:40.866 { 00:26:40.866 "name": "BaseBdev3", 00:26:40.866 "uuid": "a8d50281-c1e8-5027-a341-df63e62bb065", 
00:26:40.866 "is_configured": true, 00:26:40.866 "data_offset": 2048, 00:26:40.866 "data_size": 63488 00:26:40.866 }, 00:26:40.866 { 00:26:40.866 "name": "BaseBdev4", 00:26:40.866 "uuid": "9d1c8db1-4f59-53ff-89ab-e8eff03e22ce", 00:26:40.866 "is_configured": true, 00:26:40.866 "data_offset": 2048, 00:26:40.866 "data_size": 63488 00:26:40.866 } 00:26:40.866 ] 00:26:40.866 }' 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:40.866 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 1503676 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1503676 ']' 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 1503676 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1503676 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1503676' 00:26:41.126 killing process with pid 1503676 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 1503676 00:26:41.126 
Received shutdown signal, test time was about 60.000000 seconds 00:26:41.126 00:26:41.126 Latency(us) 00:26:41.126 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:41.126 =================================================================================================================== 00:26:41.126 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:41.126 [2024-07-24 20:02:32.522356] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:41.126 [2024-07-24 20:02:32.522462] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:41.126 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 1503676 00:26:41.126 [2024-07-24 20:02:32.522525] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:41.126 [2024-07-24 20:02:32.522539] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a228e0 name raid_bdev1, state offline 00:26:41.126 [2024-07-24 20:02:32.576839] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:26:41.386 00:26:41.386 real 0m38.549s 00:26:41.386 user 0m56.197s 00:26:41.386 sys 0m7.492s 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:41.386 ************************************ 00:26:41.386 END TEST raid_rebuild_test_sb 00:26:41.386 ************************************ 00:26:41.386 20:02:32 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:26:41.386 20:02:32 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:41.386 20:02:32 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:41.386 20:02:32 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:26:41.386 ************************************ 00:26:41.386 START TEST raid_rebuild_test_io 00:26:41.386 ************************************ 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false true true 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:41.386 20:02:32 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1509044 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1509044 /var/tmp/spdk-raid.sock 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 1509044 ']' 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:41.386 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:41.386 20:02:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:41.386 [2024-07-24 20:02:32.969065] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:26:41.386 [2024-07-24 20:02:32.969131] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1509044 ] 00:26:41.386 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:41.386 Zero copy mechanism will not be used. 
00:26:41.645 [2024-07-24 20:02:33.089080] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:41.645 [2024-07-24 20:02:33.193257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:41.907 [2024-07-24 20:02:33.251909] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:41.907 [2024-07-24 20:02:33.251943] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:41.907 20:02:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:41.907 20:02:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:26:41.907 20:02:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:41.907 20:02:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:42.165 BaseBdev1_malloc 00:26:42.165 20:02:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:42.425 [2024-07-24 20:02:33.917065] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:42.425 [2024-07-24 20:02:33.917115] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.425 [2024-07-24 20:02:33.917140] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2343cd0 00:26:42.425 [2024-07-24 20:02:33.917153] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.425 [2024-07-24 20:02:33.918837] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.425 [2024-07-24 20:02:33.918868] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:42.425 BaseBdev1 
00:26:42.425 20:02:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:42.425 20:02:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:42.684 BaseBdev2_malloc 00:26:42.684 20:02:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:42.944 [2024-07-24 20:02:34.396398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:42.944 [2024-07-24 20:02:34.396445] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.944 [2024-07-24 20:02:34.396467] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2347460 00:26:42.944 [2024-07-24 20:02:34.396480] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.944 [2024-07-24 20:02:34.398066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.944 [2024-07-24 20:02:34.398098] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:42.944 BaseBdev2 00:26:42.944 20:02:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:42.944 20:02:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:43.204 BaseBdev3_malloc 00:26:43.204 20:02:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:43.464 [2024-07-24 20:02:34.895616] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:43.464 [2024-07-24 20:02:34.895661] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:43.464 [2024-07-24 20:02:34.895681] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2407780 00:26:43.464 [2024-07-24 20:02:34.895694] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:43.464 [2024-07-24 20:02:34.897241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:43.464 [2024-07-24 20:02:34.897271] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:43.464 BaseBdev3 00:26:43.464 20:02:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:26:43.464 20:02:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:43.723 BaseBdev4_malloc 00:26:43.723 20:02:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:43.982 [2024-07-24 20:02:35.381492] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:43.982 [2024-07-24 20:02:35.381538] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:43.982 [2024-07-24 20:02:35.381561] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2406e60 00:26:43.982 [2024-07-24 20:02:35.381573] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:43.982 [2024-07-24 20:02:35.383095] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:43.982 [2024-07-24 20:02:35.383125] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:26:43.982 BaseBdev4 00:26:43.982 20:02:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:44.242 spare_malloc 00:26:44.242 20:02:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:44.501 spare_delay 00:26:44.501 20:02:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:44.760 [2024-07-24 20:02:36.128575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:44.760 [2024-07-24 20:02:36.128620] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:44.760 [2024-07-24 20:02:36.128642] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x233da50 00:26:44.760 [2024-07-24 20:02:36.128655] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:44.760 [2024-07-24 20:02:36.130087] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:44.760 [2024-07-24 20:02:36.130117] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:44.760 spare 00:26:44.760 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:45.020 [2024-07-24 20:02:36.385274] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:45.020 [2024-07-24 20:02:36.386625] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:26:45.020 [2024-07-24 20:02:36.386679] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:45.020 [2024-07-24 20:02:36.386724] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:45.020 [2024-07-24 20:02:36.386815] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2340130 00:26:45.020 [2024-07-24 20:02:36.386831] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:45.020 [2024-07-24 20:02:36.387049] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x233bfc0 00:26:45.020 [2024-07-24 20:02:36.387207] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2340130 00:26:45.020 [2024-07-24 20:02:36.387218] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2340130 00:26:45.020 [2024-07-24 20:02:36.387335] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.020 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.286 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.286 "name": "raid_bdev1", 00:26:45.286 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:45.286 "strip_size_kb": 0, 00:26:45.286 "state": "online", 00:26:45.286 "raid_level": "raid1", 00:26:45.286 "superblock": false, 00:26:45.286 "num_base_bdevs": 4, 00:26:45.286 "num_base_bdevs_discovered": 4, 00:26:45.286 "num_base_bdevs_operational": 4, 00:26:45.286 "base_bdevs_list": [ 00:26:45.286 { 00:26:45.286 "name": "BaseBdev1", 00:26:45.286 "uuid": "521b24f4-c54f-5a13-a40d-12667851d465", 00:26:45.286 "is_configured": true, 00:26:45.286 "data_offset": 0, 00:26:45.286 "data_size": 65536 00:26:45.286 }, 00:26:45.286 { 00:26:45.286 "name": "BaseBdev2", 00:26:45.286 "uuid": "4c743ac9-5851-551a-befd-04dad948d8df", 00:26:45.286 "is_configured": true, 00:26:45.286 "data_offset": 0, 00:26:45.286 "data_size": 65536 00:26:45.286 }, 00:26:45.286 { 00:26:45.286 "name": "BaseBdev3", 00:26:45.286 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:45.286 "is_configured": true, 00:26:45.286 "data_offset": 0, 00:26:45.286 "data_size": 65536 00:26:45.286 }, 00:26:45.286 { 00:26:45.286 "name": "BaseBdev4", 00:26:45.286 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:45.286 "is_configured": true, 00:26:45.286 "data_offset": 0, 00:26:45.286 "data_size": 65536 00:26:45.286 } 00:26:45.286 ] 00:26:45.286 }' 00:26:45.286 20:02:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:26:45.286 20:02:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:45.861 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:45.861 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:26:46.121 [2024-07-24 20:02:37.456377] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:46.121 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:26:46.121 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.121 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:46.380 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:26:46.380 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:26:46.380 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:46.380 20:02:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:46.380 [2024-07-24 20:02:37.839238] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2340100 00:26:46.380 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:46.380 Zero copy mechanism will not be used. 00:26:46.380 Running I/O for 60 seconds... 
00:26:46.380 [2024-07-24 20:02:37.957490] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:46.380 [2024-07-24 20:02:37.965645] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2340100 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.640 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.900 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:46.900 "name": "raid_bdev1", 00:26:46.900 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:46.900 "strip_size_kb": 0, 00:26:46.900 "state": "online", 00:26:46.900 "raid_level": "raid1", 00:26:46.900 "superblock": false, 
00:26:46.900 "num_base_bdevs": 4, 00:26:46.900 "num_base_bdevs_discovered": 3, 00:26:46.900 "num_base_bdevs_operational": 3, 00:26:46.900 "base_bdevs_list": [ 00:26:46.900 { 00:26:46.900 "name": null, 00:26:46.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.900 "is_configured": false, 00:26:46.900 "data_offset": 0, 00:26:46.900 "data_size": 65536 00:26:46.900 }, 00:26:46.900 { 00:26:46.900 "name": "BaseBdev2", 00:26:46.900 "uuid": "4c743ac9-5851-551a-befd-04dad948d8df", 00:26:46.900 "is_configured": true, 00:26:46.900 "data_offset": 0, 00:26:46.900 "data_size": 65536 00:26:46.900 }, 00:26:46.900 { 00:26:46.900 "name": "BaseBdev3", 00:26:46.900 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:46.900 "is_configured": true, 00:26:46.900 "data_offset": 0, 00:26:46.900 "data_size": 65536 00:26:46.900 }, 00:26:46.900 { 00:26:46.900 "name": "BaseBdev4", 00:26:46.900 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:46.900 "is_configured": true, 00:26:46.900 "data_offset": 0, 00:26:46.900 "data_size": 65536 00:26:46.900 } 00:26:46.900 ] 00:26:46.900 }' 00:26:46.900 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:46.900 20:02:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:47.468 20:02:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:47.728 [2024-07-24 20:02:39.068952] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:47.728 20:02:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:47.728 [2024-07-24 20:02:39.161288] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x204afb0 00:26:47.728 [2024-07-24 20:02:39.163699] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:47.728 [2024-07-24 
20:02:39.274666] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:47.728 [2024-07-24 20:02:39.275235] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:47.987 [2024-07-24 20:02:39.498856] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:47.987 [2024-07-24 20:02:39.499532] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:48.555 [2024-07-24 20:02:40.008161] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:48.555 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:48.555 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:48.555 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:48.555 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:48.555 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:48.815 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.815 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.815 [2024-07-24 20:02:40.384220] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:48.815 [2024-07-24 20:02:40.385405] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:49.074 
[2024-07-24 20:02:40.607825] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:49.333 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:49.333 "name": "raid_bdev1", 00:26:49.333 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:49.333 "strip_size_kb": 0, 00:26:49.333 "state": "online", 00:26:49.333 "raid_level": "raid1", 00:26:49.334 "superblock": false, 00:26:49.334 "num_base_bdevs": 4, 00:26:49.334 "num_base_bdevs_discovered": 4, 00:26:49.334 "num_base_bdevs_operational": 4, 00:26:49.334 "process": { 00:26:49.334 "type": "rebuild", 00:26:49.334 "target": "spare", 00:26:49.334 "progress": { 00:26:49.334 "blocks": 16384, 00:26:49.334 "percent": 25 00:26:49.334 } 00:26:49.334 }, 00:26:49.334 "base_bdevs_list": [ 00:26:49.334 { 00:26:49.334 "name": "spare", 00:26:49.334 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:49.334 "is_configured": true, 00:26:49.334 "data_offset": 0, 00:26:49.334 "data_size": 65536 00:26:49.334 }, 00:26:49.334 { 00:26:49.334 "name": "BaseBdev2", 00:26:49.334 "uuid": "4c743ac9-5851-551a-befd-04dad948d8df", 00:26:49.334 "is_configured": true, 00:26:49.334 "data_offset": 0, 00:26:49.334 "data_size": 65536 00:26:49.334 }, 00:26:49.334 { 00:26:49.334 "name": "BaseBdev3", 00:26:49.334 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:49.334 "is_configured": true, 00:26:49.334 "data_offset": 0, 00:26:49.334 "data_size": 65536 00:26:49.334 }, 00:26:49.334 { 00:26:49.334 "name": "BaseBdev4", 00:26:49.334 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:49.334 "is_configured": true, 00:26:49.334 "data_offset": 0, 00:26:49.334 "data_size": 65536 00:26:49.334 } 00:26:49.334 ] 00:26:49.334 }' 00:26:49.334 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:49.334 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d 
]] 00:26:49.334 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:49.334 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:49.334 20:02:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:49.334 [2024-07-24 20:02:40.842704] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:49.594 [2024-07-24 20:02:40.990645] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:49.594 [2024-07-24 20:02:41.047207] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:49.594 [2024-07-24 20:02:41.047813] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:49.594 [2024-07-24 20:02:41.158322] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:49.594 [2024-07-24 20:02:41.180510] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:49.594 [2024-07-24 20:02:41.180545] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:49.594 [2024-07-24 20:02:41.180556] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:49.853 [2024-07-24 20:02:41.196166] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2340100 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.853 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.113 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.113 "name": "raid_bdev1", 00:26:50.113 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:50.113 "strip_size_kb": 0, 00:26:50.113 "state": "online", 00:26:50.113 "raid_level": "raid1", 00:26:50.113 "superblock": false, 00:26:50.113 "num_base_bdevs": 4, 00:26:50.113 "num_base_bdevs_discovered": 3, 00:26:50.113 "num_base_bdevs_operational": 3, 00:26:50.113 "base_bdevs_list": [ 00:26:50.113 { 00:26:50.113 "name": null, 00:26:50.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.113 "is_configured": false, 00:26:50.113 "data_offset": 0, 00:26:50.113 "data_size": 65536 00:26:50.113 }, 00:26:50.113 { 00:26:50.113 "name": "BaseBdev2", 00:26:50.113 "uuid": "4c743ac9-5851-551a-befd-04dad948d8df", 00:26:50.113 "is_configured": true, 00:26:50.113 
"data_offset": 0, 00:26:50.113 "data_size": 65536 00:26:50.113 }, 00:26:50.113 { 00:26:50.113 "name": "BaseBdev3", 00:26:50.113 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:50.113 "is_configured": true, 00:26:50.113 "data_offset": 0, 00:26:50.113 "data_size": 65536 00:26:50.113 }, 00:26:50.113 { 00:26:50.113 "name": "BaseBdev4", 00:26:50.113 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:50.113 "is_configured": true, 00:26:50.113 "data_offset": 0, 00:26:50.113 "data_size": 65536 00:26:50.113 } 00:26:50.113 ] 00:26:50.113 }' 00:26:50.113 20:02:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.113 20:02:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:50.681 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:50.681 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:50.681 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:50.681 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:50.681 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:50.681 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.681 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.941 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.941 "name": "raid_bdev1", 00:26:50.941 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:50.941 "strip_size_kb": 0, 00:26:50.941 "state": "online", 00:26:50.941 "raid_level": "raid1", 00:26:50.941 "superblock": false, 00:26:50.941 "num_base_bdevs": 4, 00:26:50.941 
"num_base_bdevs_discovered": 3, 00:26:50.941 "num_base_bdevs_operational": 3, 00:26:50.941 "base_bdevs_list": [ 00:26:50.941 { 00:26:50.941 "name": null, 00:26:50.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.941 "is_configured": false, 00:26:50.941 "data_offset": 0, 00:26:50.941 "data_size": 65536 00:26:50.941 }, 00:26:50.941 { 00:26:50.941 "name": "BaseBdev2", 00:26:50.941 "uuid": "4c743ac9-5851-551a-befd-04dad948d8df", 00:26:50.941 "is_configured": true, 00:26:50.941 "data_offset": 0, 00:26:50.941 "data_size": 65536 00:26:50.941 }, 00:26:50.941 { 00:26:50.941 "name": "BaseBdev3", 00:26:50.941 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:50.941 "is_configured": true, 00:26:50.941 "data_offset": 0, 00:26:50.941 "data_size": 65536 00:26:50.941 }, 00:26:50.941 { 00:26:50.941 "name": "BaseBdev4", 00:26:50.941 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:50.941 "is_configured": true, 00:26:50.941 "data_offset": 0, 00:26:50.941 "data_size": 65536 00:26:50.941 } 00:26:50.941 ] 00:26:50.941 }' 00:26:50.941 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:50.941 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:50.941 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.941 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:50.941 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:51.200 [2024-07-24 20:02:42.714931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:51.459 20:02:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:26:51.459 [2024-07-24 20:02:42.808359] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x24e7390 00:26:51.459 [2024-07-24 20:02:42.809910] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:51.459 [2024-07-24 20:02:42.944420] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:51.719 [2024-07-24 20:02:43.057187] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:51.719 [2024-07-24 20:02:43.057501] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:51.976 [2024-07-24 20:02:43.430339] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:51.976 [2024-07-24 20:02:43.431628] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:52.235 [2024-07-24 20:02:43.658068] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:52.235 20:02:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:52.235 20:02:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:52.235 20:02:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:52.235 20:02:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:52.235 20:02:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:52.235 20:02:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.235 20:02:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:26:52.494 [2024-07-24 20:02:43.950884] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:52.494 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:52.494 "name": "raid_bdev1", 00:26:52.494 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:52.494 "strip_size_kb": 0, 00:26:52.494 "state": "online", 00:26:52.494 "raid_level": "raid1", 00:26:52.494 "superblock": false, 00:26:52.494 "num_base_bdevs": 4, 00:26:52.494 "num_base_bdevs_discovered": 4, 00:26:52.494 "num_base_bdevs_operational": 4, 00:26:52.494 "process": { 00:26:52.494 "type": "rebuild", 00:26:52.494 "target": "spare", 00:26:52.494 "progress": { 00:26:52.494 "blocks": 14336, 00:26:52.494 "percent": 21 00:26:52.494 } 00:26:52.494 }, 00:26:52.494 "base_bdevs_list": [ 00:26:52.494 { 00:26:52.494 "name": "spare", 00:26:52.494 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:52.494 "is_configured": true, 00:26:52.494 "data_offset": 0, 00:26:52.494 "data_size": 65536 00:26:52.494 }, 00:26:52.494 { 00:26:52.494 "name": "BaseBdev2", 00:26:52.494 "uuid": "4c743ac9-5851-551a-befd-04dad948d8df", 00:26:52.494 "is_configured": true, 00:26:52.494 "data_offset": 0, 00:26:52.494 "data_size": 65536 00:26:52.494 }, 00:26:52.494 { 00:26:52.494 "name": "BaseBdev3", 00:26:52.494 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:52.494 "is_configured": true, 00:26:52.494 "data_offset": 0, 00:26:52.494 "data_size": 65536 00:26:52.494 }, 00:26:52.494 { 00:26:52.494 "name": "BaseBdev4", 00:26:52.494 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:52.494 "is_configured": true, 00:26:52.494 "data_offset": 0, 00:26:52.494 "data_size": 65536 00:26:52.494 } 00:26:52.494 ] 00:26:52.494 }' 00:26:52.494 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:52.494 [2024-07-24 20:02:44.073588] bdev_raid.c: 
851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:26:52.754 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:52.754 [2024-07-24 20:02:44.317447] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:53.051 [2024-07-24 20:02:44.384382] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:53.051 [2024-07-24 20:02:44.560724] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2340100 00:26:53.051 [2024-07-24 20:02:44.560767] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x24e7390 00:26:53.051 [2024-07-24 20:02:44.570368] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 
00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.051 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.310 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:53.310 "name": "raid_bdev1", 00:26:53.310 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:53.310 "strip_size_kb": 0, 00:26:53.310 "state": "online", 00:26:53.310 "raid_level": "raid1", 00:26:53.310 "superblock": false, 00:26:53.310 "num_base_bdevs": 4, 00:26:53.310 "num_base_bdevs_discovered": 3, 00:26:53.310 "num_base_bdevs_operational": 3, 00:26:53.310 "process": { 00:26:53.310 "type": "rebuild", 00:26:53.310 "target": "spare", 00:26:53.310 "progress": { 00:26:53.310 "blocks": 24576, 00:26:53.310 "percent": 37 00:26:53.310 } 00:26:53.310 }, 00:26:53.310 "base_bdevs_list": [ 00:26:53.310 { 00:26:53.310 "name": "spare", 00:26:53.310 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:53.310 "is_configured": true, 00:26:53.310 "data_offset": 0, 00:26:53.310 "data_size": 65536 00:26:53.310 }, 00:26:53.310 { 00:26:53.310 "name": null, 00:26:53.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.310 "is_configured": false, 00:26:53.310 "data_offset": 0, 00:26:53.310 
"data_size": 65536 00:26:53.310 }, 00:26:53.310 { 00:26:53.310 "name": "BaseBdev3", 00:26:53.310 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:53.310 "is_configured": true, 00:26:53.310 "data_offset": 0, 00:26:53.310 "data_size": 65536 00:26:53.310 }, 00:26:53.310 { 00:26:53.310 "name": "BaseBdev4", 00:26:53.310 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:53.310 "is_configured": true, 00:26:53.311 "data_offset": 0, 00:26:53.311 "data_size": 65536 00:26:53.311 } 00:26:53.311 ] 00:26:53.311 }' 00:26:53.311 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.570 [2024-07-24 20:02:44.952540] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=985 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.570 20:02:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.829 [2024-07-24 20:02:45.182826] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:53.829 [2024-07-24 20:02:45.183305] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:53.829 20:02:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:53.829 "name": "raid_bdev1", 00:26:53.829 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:53.829 "strip_size_kb": 0, 00:26:53.829 "state": "online", 00:26:53.829 "raid_level": "raid1", 00:26:53.829 "superblock": false, 00:26:53.829 "num_base_bdevs": 4, 00:26:53.829 "num_base_bdevs_discovered": 3, 00:26:53.830 "num_base_bdevs_operational": 3, 00:26:53.830 "process": { 00:26:53.830 "type": "rebuild", 00:26:53.830 "target": "spare", 00:26:53.830 "progress": { 00:26:53.830 "blocks": 28672, 00:26:53.830 "percent": 43 00:26:53.830 } 00:26:53.830 }, 00:26:53.830 "base_bdevs_list": [ 00:26:53.830 { 00:26:53.830 "name": "spare", 00:26:53.830 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:53.830 "is_configured": true, 00:26:53.830 "data_offset": 0, 00:26:53.830 "data_size": 65536 00:26:53.830 }, 00:26:53.830 { 00:26:53.830 "name": null, 00:26:53.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.830 "is_configured": false, 00:26:53.830 "data_offset": 0, 00:26:53.830 "data_size": 65536 00:26:53.830 }, 00:26:53.830 { 00:26:53.830 "name": "BaseBdev3", 00:26:53.830 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:53.830 "is_configured": true, 00:26:53.830 "data_offset": 0, 00:26:53.830 "data_size": 65536 00:26:53.830 }, 00:26:53.830 { 00:26:53.830 "name": "BaseBdev4", 00:26:53.830 "uuid": 
"5534946f-b17b-5480-a926-08ce0510be97", 00:26:53.830 "is_configured": true, 00:26:53.830 "data_offset": 0, 00:26:53.830 "data_size": 65536 00:26:53.830 } 00:26:53.830 ] 00:26:53.830 }' 00:26:53.830 20:02:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.830 20:02:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:53.830 20:02:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.830 20:02:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:53.830 20:02:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:54.089 [2024-07-24 20:02:45.556193] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:54.348 [2024-07-24 20:02:45.686181] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:54.607 [2024-07-24 20:02:46.018807] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:54.866 [2024-07-24 20:02:46.249948] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:26:54.866 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:54.866 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:54.866 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:54.866 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:54.866 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:54.866 20:02:46 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:54.866 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.866 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:55.126 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:55.126 "name": "raid_bdev1", 00:26:55.126 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:55.126 "strip_size_kb": 0, 00:26:55.126 "state": "online", 00:26:55.126 "raid_level": "raid1", 00:26:55.126 "superblock": false, 00:26:55.126 "num_base_bdevs": 4, 00:26:55.126 "num_base_bdevs_discovered": 3, 00:26:55.126 "num_base_bdevs_operational": 3, 00:26:55.126 "process": { 00:26:55.126 "type": "rebuild", 00:26:55.126 "target": "spare", 00:26:55.126 "progress": { 00:26:55.126 "blocks": 43008, 00:26:55.126 "percent": 65 00:26:55.126 } 00:26:55.126 }, 00:26:55.126 "base_bdevs_list": [ 00:26:55.126 { 00:26:55.126 "name": "spare", 00:26:55.126 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:55.126 "is_configured": true, 00:26:55.126 "data_offset": 0, 00:26:55.126 "data_size": 65536 00:26:55.126 }, 00:26:55.126 { 00:26:55.126 "name": null, 00:26:55.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.126 "is_configured": false, 00:26:55.126 "data_offset": 0, 00:26:55.126 "data_size": 65536 00:26:55.126 }, 00:26:55.126 { 00:26:55.126 "name": "BaseBdev3", 00:26:55.126 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:55.126 "is_configured": true, 00:26:55.126 "data_offset": 0, 00:26:55.126 "data_size": 65536 00:26:55.126 }, 00:26:55.126 { 00:26:55.126 "name": "BaseBdev4", 00:26:55.126 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:55.126 "is_configured": true, 00:26:55.126 "data_offset": 0, 00:26:55.126 "data_size": 65536 00:26:55.126 } 
00:26:55.126 ] 00:26:55.126 }' 00:26:55.126 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:55.126 [2024-07-24 20:02:46.566034] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:26:55.126 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:55.126 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:55.126 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:55.126 20:02:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:55.126 [2024-07-24 20:02:46.677331] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:26:56.064 [2024-07-24 20:02:47.348884] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:26:56.064 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:56.064 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:56.064 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:56.064 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:56.064 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:56.064 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:56.065 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.065 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.324 [2024-07-24 20:02:47.803206] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:56.324 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.324 "name": "raid_bdev1", 00:26:56.324 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:56.324 "strip_size_kb": 0, 00:26:56.324 "state": "online", 00:26:56.324 "raid_level": "raid1", 00:26:56.324 "superblock": false, 00:26:56.324 "num_base_bdevs": 4, 00:26:56.324 "num_base_bdevs_discovered": 3, 00:26:56.324 "num_base_bdevs_operational": 3, 00:26:56.324 "process": { 00:26:56.324 "type": "rebuild", 00:26:56.324 "target": "spare", 00:26:56.324 "progress": { 00:26:56.324 "blocks": 65536, 00:26:56.324 "percent": 100 00:26:56.324 } 00:26:56.324 }, 00:26:56.324 "base_bdevs_list": [ 00:26:56.324 { 00:26:56.324 "name": "spare", 00:26:56.324 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:56.324 "is_configured": true, 00:26:56.324 "data_offset": 0, 00:26:56.324 "data_size": 65536 00:26:56.324 }, 00:26:56.324 { 00:26:56.324 "name": null, 00:26:56.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.324 "is_configured": false, 00:26:56.324 "data_offset": 0, 00:26:56.324 "data_size": 65536 00:26:56.324 }, 00:26:56.324 { 00:26:56.324 "name": "BaseBdev3", 00:26:56.324 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:56.324 "is_configured": true, 00:26:56.324 "data_offset": 0, 00:26:56.324 "data_size": 65536 00:26:56.324 }, 00:26:56.324 { 00:26:56.324 "name": "BaseBdev4", 00:26:56.324 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:56.324 "is_configured": true, 00:26:56.324 "data_offset": 0, 00:26:56.324 "data_size": 65536 00:26:56.324 } 00:26:56.324 ] 00:26:56.324 }' 00:26:56.324 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.324 
[2024-07-24 20:02:47.911520] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:56.324 [2024-07-24 20:02:47.914907] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:56.584 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:56.584 20:02:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.584 20:02:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:56.584 20:02:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.521 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.780 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.780 "name": "raid_bdev1", 00:26:57.780 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:57.780 "strip_size_kb": 0, 00:26:57.780 "state": "online", 00:26:57.780 "raid_level": "raid1", 00:26:57.780 "superblock": false, 
00:26:57.780 "num_base_bdevs": 4, 00:26:57.780 "num_base_bdevs_discovered": 3, 00:26:57.780 "num_base_bdevs_operational": 3, 00:26:57.780 "base_bdevs_list": [ 00:26:57.780 { 00:26:57.780 "name": "spare", 00:26:57.780 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:57.780 "is_configured": true, 00:26:57.780 "data_offset": 0, 00:26:57.780 "data_size": 65536 00:26:57.780 }, 00:26:57.781 { 00:26:57.781 "name": null, 00:26:57.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.781 "is_configured": false, 00:26:57.781 "data_offset": 0, 00:26:57.781 "data_size": 65536 00:26:57.781 }, 00:26:57.781 { 00:26:57.781 "name": "BaseBdev3", 00:26:57.781 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:57.781 "is_configured": true, 00:26:57.781 "data_offset": 0, 00:26:57.781 "data_size": 65536 00:26:57.781 }, 00:26:57.781 { 00:26:57.781 "name": "BaseBdev4", 00:26:57.781 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:57.781 "is_configured": true, 00:26:57.781 "data_offset": 0, 00:26:57.781 "data_size": 65536 00:26:57.781 } 00:26:57.781 ] 00:26:57.781 }' 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:57.781 20:02:49 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.781 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:58.349 "name": "raid_bdev1", 00:26:58.349 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:58.349 "strip_size_kb": 0, 00:26:58.349 "state": "online", 00:26:58.349 "raid_level": "raid1", 00:26:58.349 "superblock": false, 00:26:58.349 "num_base_bdevs": 4, 00:26:58.349 "num_base_bdevs_discovered": 3, 00:26:58.349 "num_base_bdevs_operational": 3, 00:26:58.349 "base_bdevs_list": [ 00:26:58.349 { 00:26:58.349 "name": "spare", 00:26:58.349 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:58.349 "is_configured": true, 00:26:58.349 "data_offset": 0, 00:26:58.349 "data_size": 65536 00:26:58.349 }, 00:26:58.349 { 00:26:58.349 "name": null, 00:26:58.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.349 "is_configured": false, 00:26:58.349 "data_offset": 0, 00:26:58.349 "data_size": 65536 00:26:58.349 }, 00:26:58.349 { 00:26:58.349 "name": "BaseBdev3", 00:26:58.349 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:58.349 "is_configured": true, 00:26:58.349 "data_offset": 0, 00:26:58.349 "data_size": 65536 00:26:58.349 }, 00:26:58.349 { 00:26:58.349 "name": "BaseBdev4", 00:26:58.349 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:58.349 "is_configured": true, 00:26:58.349 "data_offset": 0, 00:26:58.349 "data_size": 65536 00:26:58.349 } 00:26:58.349 ] 00:26:58.349 }' 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- 
# jq -r '.process.type // "none"' 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:58.349 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:58.350 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:58.350 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.350 20:02:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.609 20:02:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.609 "name": "raid_bdev1", 00:26:58.609 "uuid": "c41a02f7-fffb-4d7c-b7f2-e4a4da0bf23e", 00:26:58.609 
"strip_size_kb": 0, 00:26:58.609 "state": "online", 00:26:58.609 "raid_level": "raid1", 00:26:58.609 "superblock": false, 00:26:58.609 "num_base_bdevs": 4, 00:26:58.609 "num_base_bdevs_discovered": 3, 00:26:58.609 "num_base_bdevs_operational": 3, 00:26:58.609 "base_bdevs_list": [ 00:26:58.609 { 00:26:58.609 "name": "spare", 00:26:58.609 "uuid": "6b424f86-bbac-566a-bfc0-d2f5a63dc60a", 00:26:58.609 "is_configured": true, 00:26:58.609 "data_offset": 0, 00:26:58.609 "data_size": 65536 00:26:58.609 }, 00:26:58.609 { 00:26:58.609 "name": null, 00:26:58.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.609 "is_configured": false, 00:26:58.609 "data_offset": 0, 00:26:58.609 "data_size": 65536 00:26:58.609 }, 00:26:58.609 { 00:26:58.609 "name": "BaseBdev3", 00:26:58.609 "uuid": "f87f49fa-67db-58f0-a19f-6f6f9cc59b1a", 00:26:58.609 "is_configured": true, 00:26:58.609 "data_offset": 0, 00:26:58.609 "data_size": 65536 00:26:58.609 }, 00:26:58.609 { 00:26:58.609 "name": "BaseBdev4", 00:26:58.609 "uuid": "5534946f-b17b-5480-a926-08ce0510be97", 00:26:58.609 "is_configured": true, 00:26:58.609 "data_offset": 0, 00:26:58.609 "data_size": 65536 00:26:58.609 } 00:26:58.609 ] 00:26:58.609 }' 00:26:58.609 20:02:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.609 20:02:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:59.176 20:02:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:59.435 [2024-07-24 20:02:50.875084] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:59.435 [2024-07-24 20:02:50.875117] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:59.435 00:26:59.435 Latency(us) 00:26:59.435 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:59.435 Job: 
raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:59.435 raid_bdev1 : 13.03 89.98 269.93 0.00 0.00 14977.01 292.06 122181.90 00:26:59.435 =================================================================================================================== 00:26:59.435 Total : 89.98 269.93 0.00 0.00 14977.01 292.06 122181.90 00:26:59.435 [2024-07-24 20:02:50.903050] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:59.435 [2024-07-24 20:02:50.903078] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:59.435 [2024-07-24 20:02:50.903175] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:59.435 [2024-07-24 20:02:50.903188] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2340130 name raid_bdev1, state offline 00:26:59.435 0 00:26:59.435 20:02:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.435 20:02:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # jq length 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:59.694 20:02:51 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:59.694 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:59.953 /dev/nbd0 00:26:59.953 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:59.953 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:59.953 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:59.954 1+0 records in 00:26:59.954 1+0 records out 
00:26:59.954 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292646 s, 14.0 MB/s 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@743 -- # continue 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd1') 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:59.954 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:27:00.213 /dev/nbd1 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:00.213 1+0 records in 00:27:00.213 1+0 records out 00:27:00.213 4096 bytes (4.1 kB, 4.0 KiB) copied, 
0.00028684 s, 14.3 MB/s 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:00.213 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:00.473 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:00.473 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:00.473 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:00.473 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:00.473 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:00.473 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.473 20:02:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:00.732 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
nbd_start_disk BaseBdev4 /dev/nbd1 00:27:00.996 /dev/nbd1 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:00.996 1+0 records in 00:27:00.996 1+0 records out 00:27:00.996 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288879 s, 14.2 MB/s 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:00.996 
20:02:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:00.996 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:00.997 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:00.997 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:00.997 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:00.997 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:00.997 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.997 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:01.256 20:02:52 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:01.256 20:02:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1509044 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@950 -- # '[' -z 1509044 ']' 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 1509044 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1509044 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1509044' 00:27:01.516 killing process with pid 1509044 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 1509044 00:27:01.516 Received shutdown signal, test time was about 15.205886 seconds 00:27:01.516 00:27:01.516 Latency(us) 00:27:01.516 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:01.516 =================================================================================================================== 00:27:01.516 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:01.516 [2024-07-24 20:02:53.084643] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:01.516 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1509044 00:27:01.776 [2024-07-24 20:02:53.125421] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:01.776 20:02:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:27:01.776 00:27:01.776 real 0m20.439s 00:27:01.776 user 0m31.568s 00:27:01.776 sys 0m3.637s 00:27:01.776 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:27:01.776 20:02:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:01.776 ************************************ 00:27:01.776 END TEST raid_rebuild_test_io 00:27:01.776 ************************************ 00:27:02.034 20:02:53 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:27:02.034 20:02:53 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:27:02.034 20:02:53 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:02.034 20:02:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:02.034 ************************************ 00:27:02.034 START TEST raid_rebuild_test_sb_io 00:27:02.034 ************************************ 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true true true 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:02.034 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:02.034 20:02:53 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = 
true ']' 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1511931 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1511931 /var/tmp/spdk-raid.sock 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1511931 ']' 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:02.035 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:02.035 20:02:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:02.035 [2024-07-24 20:02:53.491480] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:27:02.035 [2024-07-24 20:02:53.491549] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1511931 ] 00:27:02.035 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:27:02.035 Zero copy mechanism will not be used. 00:27:02.035 [2024-07-24 20:02:53.619562] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:02.294 [2024-07-24 20:02:53.726260] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:02.294 [2024-07-24 20:02:53.790378] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:02.294 [2024-07-24 20:02:53.790418] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:02.863 20:02:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:02.863 20:02:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:27:02.863 20:02:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:02.863 20:02:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:03.121 BaseBdev1_malloc 00:27:03.121 20:02:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:03.380 [2024-07-24 20:02:54.899905] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:03.380 [2024-07-24 20:02:54.899954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.380 [2024-07-24 20:02:54.899978] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa91cd0 00:27:03.380 [2024-07-24 20:02:54.899990] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.381 [2024-07-24 20:02:54.901733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.381 [2024-07-24 20:02:54.901764] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:27:03.381 BaseBdev1 00:27:03.381 20:02:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:03.381 20:02:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:03.639 BaseBdev2_malloc 00:27:03.639 20:02:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:03.898 [2024-07-24 20:02:55.385949] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:03.898 [2024-07-24 20:02:55.385995] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.898 [2024-07-24 20:02:55.386014] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa95460 00:27:03.898 [2024-07-24 20:02:55.386027] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.898 [2024-07-24 20:02:55.387550] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.898 [2024-07-24 20:02:55.387578] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:03.898 BaseBdev2 00:27:03.898 20:02:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:03.898 20:02:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:04.157 BaseBdev3_malloc 00:27:04.157 20:02:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev3_malloc -p BaseBdev3 00:27:04.416 [2024-07-24 20:02:55.891928] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:04.416 [2024-07-24 20:02:55.891976] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:04.416 [2024-07-24 20:02:55.891996] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb55780 00:27:04.416 [2024-07-24 20:02:55.892009] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:04.416 [2024-07-24 20:02:55.893594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:04.416 [2024-07-24 20:02:55.893623] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:04.416 BaseBdev3 00:27:04.416 20:02:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:04.416 20:02:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:04.675 BaseBdev4_malloc 00:27:04.675 20:02:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:04.934 [2024-07-24 20:02:56.390970] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:27:04.934 [2024-07-24 20:02:56.391019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:04.934 [2024-07-24 20:02:56.391039] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb54e60 00:27:04.934 [2024-07-24 20:02:56.391052] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:04.934 [2024-07-24 20:02:56.392634] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:27:04.934 [2024-07-24 20:02:56.392665] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:27:04.934 BaseBdev4 00:27:04.934 20:02:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:05.193 spare_malloc 00:27:05.193 20:02:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:05.452 spare_delay 00:27:05.452 20:02:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:05.710 [2024-07-24 20:02:57.129523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:05.710 [2024-07-24 20:02:57.129570] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:05.710 [2024-07-24 20:02:57.129594] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa8ba50 00:27:05.710 [2024-07-24 20:02:57.129607] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:05.710 [2024-07-24 20:02:57.131167] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:05.710 [2024-07-24 20:02:57.131197] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:05.710 spare 00:27:05.710 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:05.969 [2024-07-24 20:02:57.374205] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:27:05.969 [2024-07-24 20:02:57.375511] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:05.969 [2024-07-24 20:02:57.375564] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:05.969 [2024-07-24 20:02:57.375611] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:05.969 [2024-07-24 20:02:57.375814] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa8e130 00:27:05.969 [2024-07-24 20:02:57.375825] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:05.969 [2024-07-24 20:02:57.376029] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa89fc0 00:27:05.969 [2024-07-24 20:02:57.376177] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa8e130 00:27:05.969 [2024-07-24 20:02:57.376187] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa8e130 00:27:05.969 [2024-07-24 20:02:57.376286] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:05.969 20:02:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.969 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.228 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:06.228 "name": "raid_bdev1", 00:27:06.228 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:06.228 "strip_size_kb": 0, 00:27:06.228 "state": "online", 00:27:06.228 "raid_level": "raid1", 00:27:06.228 "superblock": true, 00:27:06.228 "num_base_bdevs": 4, 00:27:06.228 "num_base_bdevs_discovered": 4, 00:27:06.228 "num_base_bdevs_operational": 4, 00:27:06.228 "base_bdevs_list": [ 00:27:06.228 { 00:27:06.228 "name": "BaseBdev1", 00:27:06.228 "uuid": "de42b4db-14b3-5fa7-ba4a-4c1a39d4cf9f", 00:27:06.228 "is_configured": true, 00:27:06.228 "data_offset": 2048, 00:27:06.228 "data_size": 63488 00:27:06.228 }, 00:27:06.228 { 00:27:06.228 "name": "BaseBdev2", 00:27:06.228 "uuid": "934ff626-0411-5d8d-8b70-696868535c1b", 00:27:06.228 "is_configured": true, 00:27:06.228 "data_offset": 2048, 00:27:06.228 "data_size": 63488 00:27:06.228 }, 00:27:06.228 { 00:27:06.228 "name": "BaseBdev3", 00:27:06.228 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:06.228 "is_configured": true, 00:27:06.228 "data_offset": 2048, 00:27:06.228 "data_size": 63488 00:27:06.228 }, 00:27:06.228 { 00:27:06.228 "name": "BaseBdev4", 00:27:06.228 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:06.228 "is_configured": true, 00:27:06.228 "data_offset": 2048, 00:27:06.228 
"data_size": 63488 00:27:06.228 } 00:27:06.228 ] 00:27:06.228 }' 00:27:06.228 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:06.228 20:02:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:07.166 20:02:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:07.166 20:02:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:27:07.736 [2024-07-24 20:02:59.058954] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:07.736 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:27:07.736 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.736 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:07.995 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:27:07.995 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:27:07.995 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:07.995 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:27:07.995 [2024-07-24 20:02:59.517980] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc42960 00:27:07.995 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:27:07.995 Zero copy mechanism will not be used. 00:27:07.995 Running I/O for 60 seconds... 00:27:08.255 [2024-07-24 20:02:59.640249] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:08.255 [2024-07-24 20:02:59.651380] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc42960 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.255 20:02:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.825 20:03:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.825 "name": "raid_bdev1", 00:27:08.825 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 
00:27:08.825 "strip_size_kb": 0, 00:27:08.825 "state": "online", 00:27:08.825 "raid_level": "raid1", 00:27:08.825 "superblock": true, 00:27:08.825 "num_base_bdevs": 4, 00:27:08.825 "num_base_bdevs_discovered": 3, 00:27:08.825 "num_base_bdevs_operational": 3, 00:27:08.825 "base_bdevs_list": [ 00:27:08.825 { 00:27:08.825 "name": null, 00:27:08.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.825 "is_configured": false, 00:27:08.825 "data_offset": 2048, 00:27:08.825 "data_size": 63488 00:27:08.825 }, 00:27:08.825 { 00:27:08.825 "name": "BaseBdev2", 00:27:08.825 "uuid": "934ff626-0411-5d8d-8b70-696868535c1b", 00:27:08.825 "is_configured": true, 00:27:08.825 "data_offset": 2048, 00:27:08.825 "data_size": 63488 00:27:08.825 }, 00:27:08.825 { 00:27:08.825 "name": "BaseBdev3", 00:27:08.825 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:08.825 "is_configured": true, 00:27:08.825 "data_offset": 2048, 00:27:08.825 "data_size": 63488 00:27:08.825 }, 00:27:08.825 { 00:27:08.825 "name": "BaseBdev4", 00:27:08.825 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:08.825 "is_configured": true, 00:27:08.825 "data_offset": 2048, 00:27:08.825 "data_size": 63488 00:27:08.825 } 00:27:08.825 ] 00:27:08.825 }' 00:27:08.825 20:03:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.825 20:03:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:09.763 20:03:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:10.064 [2024-07-24 20:03:01.431319] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:10.064 20:03:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:10.064 [2024-07-24 20:03:01.517412] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x798fb0 00:27:10.064 
[2024-07-24 20:03:01.519773] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:10.064 [2024-07-24 20:03:01.645420] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:10.322 [2024-07-24 20:03:01.891373] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:10.322 [2024-07-24 20:03:01.892081] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:10.889 [2024-07-24 20:03:02.308965] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:10.889 [2024-07-24 20:03:02.429624] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:10.889 [2024-07-24 20:03:02.429940] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:11.148 20:03:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:11.148 20:03:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:11.148 20:03:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:11.148 20:03:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:11.148 20:03:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:11.148 20:03:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.148 20:03:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:27:11.148 [2024-07-24 20:03:02.687897] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:11.148 [2024-07-24 20:03:02.688216] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:11.407 [2024-07-24 20:03:02.800572] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:11.407 [2024-07-24 20:03:02.800745] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:11.667 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:11.667 "name": "raid_bdev1", 00:27:11.667 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:11.667 "strip_size_kb": 0, 00:27:11.667 "state": "online", 00:27:11.667 "raid_level": "raid1", 00:27:11.667 "superblock": true, 00:27:11.667 "num_base_bdevs": 4, 00:27:11.667 "num_base_bdevs_discovered": 4, 00:27:11.667 "num_base_bdevs_operational": 4, 00:27:11.667 "process": { 00:27:11.667 "type": "rebuild", 00:27:11.667 "target": "spare", 00:27:11.667 "progress": { 00:27:11.667 "blocks": 18432, 00:27:11.667 "percent": 29 00:27:11.667 } 00:27:11.667 }, 00:27:11.667 "base_bdevs_list": [ 00:27:11.667 { 00:27:11.667 "name": "spare", 00:27:11.667 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:11.667 "is_configured": true, 00:27:11.667 "data_offset": 2048, 00:27:11.667 "data_size": 63488 00:27:11.667 }, 00:27:11.667 { 00:27:11.667 "name": "BaseBdev2", 00:27:11.667 "uuid": "934ff626-0411-5d8d-8b70-696868535c1b", 00:27:11.667 "is_configured": true, 00:27:11.667 "data_offset": 2048, 00:27:11.667 "data_size": 63488 00:27:11.667 }, 00:27:11.667 { 00:27:11.667 "name": "BaseBdev3", 00:27:11.667 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:11.667 "is_configured": true, 00:27:11.667 "data_offset": 2048, 00:27:11.667 
"data_size": 63488 00:27:11.667 }, 00:27:11.667 { 00:27:11.667 "name": "BaseBdev4", 00:27:11.667 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:11.667 "is_configured": true, 00:27:11.667 "data_offset": 2048, 00:27:11.667 "data_size": 63488 00:27:11.667 } 00:27:11.667 ] 00:27:11.667 }' 00:27:11.667 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:11.667 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:11.667 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:11.667 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:11.667 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:11.667 [2024-07-24 20:03:03.173492] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:27:11.926 [2024-07-24 20:03:03.408416] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:11.926 [2024-07-24 20:03:03.509916] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:27:12.184 [2024-07-24 20:03:03.612291] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:12.184 [2024-07-24 20:03:03.623375] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:12.184 [2024-07-24 20:03:03.623418] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:12.184 [2024-07-24 20:03:03.623428] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:12.184 [2024-07-24 20:03:03.660074] 
bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xc42960 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.184 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.443 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.443 "name": "raid_bdev1", 00:27:12.443 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:12.443 "strip_size_kb": 0, 00:27:12.444 "state": "online", 00:27:12.444 "raid_level": "raid1", 00:27:12.444 "superblock": true, 00:27:12.444 "num_base_bdevs": 4, 00:27:12.444 "num_base_bdevs_discovered": 3, 00:27:12.444 
"num_base_bdevs_operational": 3, 00:27:12.444 "base_bdevs_list": [ 00:27:12.444 { 00:27:12.444 "name": null, 00:27:12.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.444 "is_configured": false, 00:27:12.444 "data_offset": 2048, 00:27:12.444 "data_size": 63488 00:27:12.444 }, 00:27:12.444 { 00:27:12.444 "name": "BaseBdev2", 00:27:12.444 "uuid": "934ff626-0411-5d8d-8b70-696868535c1b", 00:27:12.444 "is_configured": true, 00:27:12.444 "data_offset": 2048, 00:27:12.444 "data_size": 63488 00:27:12.444 }, 00:27:12.444 { 00:27:12.444 "name": "BaseBdev3", 00:27:12.444 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:12.444 "is_configured": true, 00:27:12.444 "data_offset": 2048, 00:27:12.444 "data_size": 63488 00:27:12.444 }, 00:27:12.444 { 00:27:12.444 "name": "BaseBdev4", 00:27:12.444 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:12.444 "is_configured": true, 00:27:12.444 "data_offset": 2048, 00:27:12.444 "data_size": 63488 00:27:12.444 } 00:27:12.444 ] 00:27:12.444 }' 00:27:12.444 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.444 20:03:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:13.011 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:13.011 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:13.011 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:13.011 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:13.011 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:13.011 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.011 20:03:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.579 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:13.579 "name": "raid_bdev1", 00:27:13.579 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:13.579 "strip_size_kb": 0, 00:27:13.579 "state": "online", 00:27:13.579 "raid_level": "raid1", 00:27:13.579 "superblock": true, 00:27:13.579 "num_base_bdevs": 4, 00:27:13.579 "num_base_bdevs_discovered": 3, 00:27:13.579 "num_base_bdevs_operational": 3, 00:27:13.579 "base_bdevs_list": [ 00:27:13.579 { 00:27:13.579 "name": null, 00:27:13.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.579 "is_configured": false, 00:27:13.579 "data_offset": 2048, 00:27:13.579 "data_size": 63488 00:27:13.579 }, 00:27:13.579 { 00:27:13.579 "name": "BaseBdev2", 00:27:13.579 "uuid": "934ff626-0411-5d8d-8b70-696868535c1b", 00:27:13.579 "is_configured": true, 00:27:13.579 "data_offset": 2048, 00:27:13.579 "data_size": 63488 00:27:13.579 }, 00:27:13.579 { 00:27:13.579 "name": "BaseBdev3", 00:27:13.579 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:13.579 "is_configured": true, 00:27:13.579 "data_offset": 2048, 00:27:13.579 "data_size": 63488 00:27:13.579 }, 00:27:13.579 { 00:27:13.579 "name": "BaseBdev4", 00:27:13.579 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:13.579 "is_configured": true, 00:27:13.579 "data_offset": 2048, 00:27:13.579 "data_size": 63488 00:27:13.579 } 00:27:13.579 ] 00:27:13.579 }' 00:27:13.579 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:13.579 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:13.579 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:13.579 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 
00:27:13.579 20:03:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:13.838 [2024-07-24 20:03:05.193104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:13.838 20:03:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:27:13.838 [2024-07-24 20:03:05.258439] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaa8c30 00:27:13.838 [2024-07-24 20:03:05.259939] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:13.838 [2024-07-24 20:03:05.371418] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:13.838 [2024-07-24 20:03:05.371915] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:14.097 [2024-07-24 20:03:05.494134] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:14.097 [2024-07-24 20:03:05.494421] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:14.356 [2024-07-24 20:03:05.752832] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:14.356 [2024-07-24 20:03:05.897696] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:14.924 "name": "raid_bdev1", 00:27:14.924 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:14.924 "strip_size_kb": 0, 00:27:14.924 "state": "online", 00:27:14.924 "raid_level": "raid1", 00:27:14.924 "superblock": true, 00:27:14.924 "num_base_bdevs": 4, 00:27:14.924 "num_base_bdevs_discovered": 4, 00:27:14.924 "num_base_bdevs_operational": 4, 00:27:14.924 "process": { 00:27:14.924 "type": "rebuild", 00:27:14.924 "target": "spare", 00:27:14.924 "progress": { 00:27:14.924 "blocks": 16384, 00:27:14.924 "percent": 25 00:27:14.924 } 00:27:14.924 }, 00:27:14.924 "base_bdevs_list": [ 00:27:14.924 { 00:27:14.924 "name": "spare", 00:27:14.924 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:14.924 "is_configured": true, 00:27:14.924 "data_offset": 2048, 00:27:14.924 "data_size": 63488 00:27:14.924 }, 00:27:14.924 { 00:27:14.924 "name": "BaseBdev2", 00:27:14.924 "uuid": "934ff626-0411-5d8d-8b70-696868535c1b", 00:27:14.924 "is_configured": true, 00:27:14.924 "data_offset": 2048, 00:27:14.924 "data_size": 63488 00:27:14.924 }, 00:27:14.924 { 00:27:14.924 "name": "BaseBdev3", 00:27:14.924 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:14.924 "is_configured": true, 00:27:14.924 "data_offset": 2048, 00:27:14.924 "data_size": 63488 00:27:14.924 }, 00:27:14.924 { 00:27:14.924 "name": "BaseBdev4", 
00:27:14.924 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:14.924 "is_configured": true, 00:27:14.924 "data_offset": 2048, 00:27:14.924 "data_size": 63488 00:27:14.924 } 00:27:14.924 ] 00:27:14.924 }' 00:27:14.924 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:27:15.183 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:27:15.183 20:03:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:15.442 [2024-07-24 20:03:06.873226] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:15.442 [2024-07-24 20:03:07.019129] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:27:15.701 [2024-07-24 20:03:07.195112] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xc42960 00:27:15.701 [2024-07-24 20:03:07.195140] 
bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xaa8c30 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.701 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.268 [2024-07-24 20:03:07.662735] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.268 "name": "raid_bdev1", 00:27:16.268 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:16.268 "strip_size_kb": 0, 00:27:16.268 "state": "online", 00:27:16.268 "raid_level": "raid1", 00:27:16.268 "superblock": true, 00:27:16.268 "num_base_bdevs": 4, 00:27:16.268 "num_base_bdevs_discovered": 3, 00:27:16.268 "num_base_bdevs_operational": 3, 00:27:16.268 "process": { 00:27:16.268 "type": "rebuild", 00:27:16.268 "target": "spare", 00:27:16.268 "progress": { 00:27:16.268 "blocks": 32768, 
00:27:16.268 "percent": 51 00:27:16.268 } 00:27:16.268 }, 00:27:16.268 "base_bdevs_list": [ 00:27:16.268 { 00:27:16.268 "name": "spare", 00:27:16.268 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:16.268 "is_configured": true, 00:27:16.268 "data_offset": 2048, 00:27:16.268 "data_size": 63488 00:27:16.268 }, 00:27:16.268 { 00:27:16.268 "name": null, 00:27:16.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.268 "is_configured": false, 00:27:16.268 "data_offset": 2048, 00:27:16.268 "data_size": 63488 00:27:16.268 }, 00:27:16.268 { 00:27:16.268 "name": "BaseBdev3", 00:27:16.268 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:16.268 "is_configured": true, 00:27:16.268 "data_offset": 2048, 00:27:16.268 "data_size": 63488 00:27:16.268 }, 00:27:16.268 { 00:27:16.268 "name": "BaseBdev4", 00:27:16.268 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:16.268 "is_configured": true, 00:27:16.268 "data_offset": 2048, 00:27:16.268 "data_size": 63488 00:27:16.268 } 00:27:16.268 ] 00:27:16.268 }' 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=1008 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.268 20:03:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.268 20:03:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.528 20:03:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.528 "name": "raid_bdev1", 00:27:16.528 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:16.528 "strip_size_kb": 0, 00:27:16.528 "state": "online", 00:27:16.528 "raid_level": "raid1", 00:27:16.528 "superblock": true, 00:27:16.528 "num_base_bdevs": 4, 00:27:16.528 "num_base_bdevs_discovered": 3, 00:27:16.528 "num_base_bdevs_operational": 3, 00:27:16.528 "process": { 00:27:16.528 "type": "rebuild", 00:27:16.528 "target": "spare", 00:27:16.528 "progress": { 00:27:16.528 "blocks": 38912, 00:27:16.528 "percent": 61 00:27:16.528 } 00:27:16.528 }, 00:27:16.528 "base_bdevs_list": [ 00:27:16.528 { 00:27:16.528 "name": "spare", 00:27:16.528 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:16.528 "is_configured": true, 00:27:16.528 "data_offset": 2048, 00:27:16.528 "data_size": 63488 00:27:16.528 }, 00:27:16.528 { 00:27:16.528 "name": null, 00:27:16.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.528 "is_configured": false, 00:27:16.528 "data_offset": 2048, 00:27:16.528 "data_size": 63488 00:27:16.528 }, 00:27:16.528 { 00:27:16.528 "name": "BaseBdev3", 00:27:16.528 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:16.528 "is_configured": true, 00:27:16.528 "data_offset": 2048, 00:27:16.528 "data_size": 63488 
00:27:16.528 }, 00:27:16.528 { 00:27:16.528 "name": "BaseBdev4", 00:27:16.528 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:16.528 "is_configured": true, 00:27:16.528 "data_offset": 2048, 00:27:16.528 "data_size": 63488 00:27:16.528 } 00:27:16.528 ] 00:27:16.528 }' 00:27:16.528 20:03:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.787 20:03:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:16.787 20:03:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.787 20:03:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:16.787 20:03:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:17.046 [2024-07-24 20:03:08.493737] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:27:17.614 [2024-07-24 20:03:09.012380] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:27:17.614 [2024-07-24 20:03:09.013256] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.614 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:17.872 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:17.872 "name": "raid_bdev1", 00:27:17.872 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:17.872 "strip_size_kb": 0, 00:27:17.872 "state": "online", 00:27:17.872 "raid_level": "raid1", 00:27:17.872 "superblock": true, 00:27:17.872 "num_base_bdevs": 4, 00:27:17.872 "num_base_bdevs_discovered": 3, 00:27:17.872 "num_base_bdevs_operational": 3, 00:27:17.872 "process": { 00:27:17.872 "type": "rebuild", 00:27:17.872 "target": "spare", 00:27:17.872 "progress": { 00:27:17.872 "blocks": 61440, 00:27:17.872 "percent": 96 00:27:17.872 } 00:27:17.872 }, 00:27:17.872 "base_bdevs_list": [ 00:27:17.872 { 00:27:17.872 "name": "spare", 00:27:17.872 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:17.872 "is_configured": true, 00:27:17.872 "data_offset": 2048, 00:27:17.872 "data_size": 63488 00:27:17.872 }, 00:27:17.872 { 00:27:17.872 "name": null, 00:27:17.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:17.872 "is_configured": false, 00:27:17.872 "data_offset": 2048, 00:27:17.872 "data_size": 63488 00:27:17.872 }, 00:27:17.872 { 00:27:17.872 "name": "BaseBdev3", 00:27:17.872 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:17.872 "is_configured": true, 00:27:17.872 "data_offset": 2048, 00:27:17.872 "data_size": 63488 00:27:17.872 }, 00:27:17.872 { 00:27:17.872 "name": "BaseBdev4", 00:27:17.872 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:17.872 "is_configured": true, 00:27:17.872 "data_offset": 2048, 00:27:17.872 "data_size": 63488 00:27:17.872 } 00:27:17.872 ] 00:27:17.872 }' 00:27:17.872 
20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:18.131 [2024-07-24 20:03:09.465233] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:18.131 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:18.131 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:18.131 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:18.131 20:03:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:18.131 [2024-07-24 20:03:09.565446] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:18.131 [2024-07-24 20:03:09.576107] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.069 20:03:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.638 20:03:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:19.638 "name": "raid_bdev1", 00:27:19.638 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:19.638 "strip_size_kb": 0, 00:27:19.638 "state": "online", 00:27:19.638 "raid_level": "raid1", 00:27:19.638 "superblock": true, 00:27:19.638 "num_base_bdevs": 4, 00:27:19.638 "num_base_bdevs_discovered": 3, 00:27:19.638 "num_base_bdevs_operational": 3, 00:27:19.638 "base_bdevs_list": [ 00:27:19.638 { 00:27:19.638 "name": "spare", 00:27:19.638 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:19.638 "is_configured": true, 00:27:19.638 "data_offset": 2048, 00:27:19.638 "data_size": 63488 00:27:19.638 }, 00:27:19.638 { 00:27:19.638 "name": null, 00:27:19.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.638 "is_configured": false, 00:27:19.638 "data_offset": 2048, 00:27:19.638 "data_size": 63488 00:27:19.638 }, 00:27:19.638 { 00:27:19.638 "name": "BaseBdev3", 00:27:19.638 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:19.638 "is_configured": true, 00:27:19.638 "data_offset": 2048, 00:27:19.638 "data_size": 63488 00:27:19.638 }, 00:27:19.638 { 00:27:19.638 "name": "BaseBdev4", 00:27:19.638 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:19.638 "is_configured": true, 00:27:19.638 "data_offset": 2048, 00:27:19.638 "data_size": 63488 00:27:19.638 } 00:27:19.638 ] 00:27:19.638 }' 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:27:19.638 20:03:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:19.638 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.897 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.156 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:20.156 "name": "raid_bdev1", 00:27:20.156 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:20.156 "strip_size_kb": 0, 00:27:20.156 "state": "online", 00:27:20.156 "raid_level": "raid1", 00:27:20.156 "superblock": true, 00:27:20.156 "num_base_bdevs": 4, 00:27:20.156 "num_base_bdevs_discovered": 3, 00:27:20.156 "num_base_bdevs_operational": 3, 00:27:20.156 "base_bdevs_list": [ 00:27:20.156 { 00:27:20.156 "name": "spare", 00:27:20.156 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:20.156 "is_configured": true, 00:27:20.156 "data_offset": 2048, 00:27:20.156 "data_size": 63488 00:27:20.156 }, 00:27:20.156 { 00:27:20.156 "name": null, 00:27:20.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.156 "is_configured": false, 00:27:20.156 "data_offset": 2048, 00:27:20.156 "data_size": 63488 00:27:20.156 }, 00:27:20.156 { 00:27:20.156 "name": "BaseBdev3", 00:27:20.156 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:20.156 "is_configured": true, 00:27:20.156 "data_offset": 2048, 
00:27:20.156 "data_size": 63488 00:27:20.156 }, 00:27:20.156 { 00:27:20.156 "name": "BaseBdev4", 00:27:20.156 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:20.156 "is_configured": true, 00:27:20.156 "data_offset": 2048, 00:27:20.156 "data_size": 63488 00:27:20.156 } 00:27:20.156 ] 00:27:20.156 }' 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.415 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.416 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.416 20:03:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.985 20:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:20.985 "name": "raid_bdev1", 00:27:20.985 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:20.985 "strip_size_kb": 0, 00:27:20.985 "state": "online", 00:27:20.985 "raid_level": "raid1", 00:27:20.985 "superblock": true, 00:27:20.985 "num_base_bdevs": 4, 00:27:20.985 "num_base_bdevs_discovered": 3, 00:27:20.985 "num_base_bdevs_operational": 3, 00:27:20.985 "base_bdevs_list": [ 00:27:20.985 { 00:27:20.985 "name": "spare", 00:27:20.985 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:20.985 "is_configured": true, 00:27:20.985 "data_offset": 2048, 00:27:20.985 "data_size": 63488 00:27:20.985 }, 00:27:20.985 { 00:27:20.985 "name": null, 00:27:20.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.985 "is_configured": false, 00:27:20.985 "data_offset": 2048, 00:27:20.985 "data_size": 63488 00:27:20.985 }, 00:27:20.985 { 00:27:20.985 "name": "BaseBdev3", 00:27:20.985 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:20.985 "is_configured": true, 00:27:20.985 "data_offset": 2048, 00:27:20.985 "data_size": 63488 00:27:20.985 }, 00:27:20.985 { 00:27:20.985 "name": "BaseBdev4", 00:27:20.985 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:20.985 "is_configured": true, 00:27:20.985 "data_offset": 2048, 00:27:20.985 "data_size": 63488 00:27:20.985 } 00:27:20.985 ] 00:27:20.985 }' 00:27:20.985 20:03:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:20.985 20:03:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:21.924 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:22.185 [2024-07-24 20:03:13.528740] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:22.185 [2024-07-24 20:03:13.528772] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:22.185 00:27:22.185 Latency(us) 00:27:22.185 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:22.185 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:27:22.185 raid_bdev1 : 14.01 83.61 250.82 0.00 0.00 17104.29 300.97 125829.12 00:27:22.185 =================================================================================================================== 00:27:22.185 Total : 83.61 250.82 0.00 0.00 17104.29 300.97 125829.12 00:27:22.185 [2024-07-24 20:03:13.564788] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:22.185 [2024-07-24 20:03:13.564817] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:22.185 [2024-07-24 20:03:13.564913] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:22.185 [2024-07-24 20:03:13.564926] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa8e130 name raid_bdev1, state offline 00:27:22.185 0 00:27:22.185 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.185 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.444 20:03:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:27:22.704 /dev/nbd0 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 
/proc/partitions 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:22.704 1+0 records in 00:27:22.704 1+0 records out 00:27:22.704 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261211 s, 15.7 MB/s 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@743 -- # continue 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:27:22.704 20:03:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.704 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:27:22.963 /dev/nbd1 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:22.963 1+0 records in 00:27:22.963 1+0 records out 00:27:22.963 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297996 s, 13.7 MB/s 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:22.963 20:03:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:22.963 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev4') 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:23.223 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:27:23.483 /dev/nbd1 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:23.483 1+0 records in 00:27:23.483 1+0 records out 00:27:23.483 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258367 s, 15.9 MB/s 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:23.483 20:03:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:23.483 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:23.483 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:23.483 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:23.483 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:23.483 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:23.483 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:23.483 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:23.743 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:24.002 20:03:15 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:27:24.002 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:24.260 20:03:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:24.519 [2024-07-24 20:03:16.051033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:24.519 [2024-07-24 20:03:16.051079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:24.519 [2024-07-24 20:03:16.051103] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa8bc80 00:27:24.519 [2024-07-24 20:03:16.051117] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:24.519 [2024-07-24 20:03:16.052755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:24.519 [2024-07-24 20:03:16.052786] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:24.519 [2024-07-24 20:03:16.052888] 
bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:24.520 [2024-07-24 20:03:16.052917] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.520 [2024-07-24 20:03:16.053001] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:24.520 [2024-07-24 20:03:16.053072] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:24.520 spare 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.520 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.779 [2024-07-24 20:03:16.153396] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xb299a0 00:27:24.779 [2024-07-24 20:03:16.153415] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:24.779 [2024-07-24 20:03:16.153618] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa89fc0 00:27:24.779 [2024-07-24 20:03:16.153780] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb299a0 00:27:24.779 [2024-07-24 20:03:16.153790] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb299a0 00:27:24.779 [2024-07-24 20:03:16.153905] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:24.779 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:24.779 "name": "raid_bdev1", 00:27:24.779 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:24.779 "strip_size_kb": 0, 00:27:24.779 "state": "online", 00:27:24.779 "raid_level": "raid1", 00:27:24.779 "superblock": true, 00:27:24.779 "num_base_bdevs": 4, 00:27:24.779 "num_base_bdevs_discovered": 3, 00:27:24.779 "num_base_bdevs_operational": 3, 00:27:24.779 "base_bdevs_list": [ 00:27:24.779 { 00:27:24.779 "name": "spare", 00:27:24.779 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:24.779 "is_configured": true, 00:27:24.779 "data_offset": 2048, 00:27:24.779 "data_size": 63488 00:27:24.779 }, 00:27:24.779 { 00:27:24.779 "name": null, 00:27:24.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.779 "is_configured": false, 00:27:24.779 "data_offset": 2048, 00:27:24.779 "data_size": 63488 00:27:24.779 }, 00:27:24.779 { 00:27:24.779 "name": "BaseBdev3", 00:27:24.779 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:24.779 "is_configured": true, 00:27:24.779 "data_offset": 2048, 00:27:24.779 "data_size": 63488 00:27:24.779 }, 00:27:24.779 { 00:27:24.779 "name": "BaseBdev4", 00:27:24.779 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:24.779 
"is_configured": true, 00:27:24.779 "data_offset": 2048, 00:27:24.779 "data_size": 63488 00:27:24.779 } 00:27:24.779 ] 00:27:24.779 }' 00:27:24.779 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:24.779 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:25.347 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:25.347 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:25.347 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:25.347 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:25.347 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:25.347 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.347 20:03:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.607 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:25.607 "name": "raid_bdev1", 00:27:25.607 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:25.607 "strip_size_kb": 0, 00:27:25.607 "state": "online", 00:27:25.607 "raid_level": "raid1", 00:27:25.607 "superblock": true, 00:27:25.607 "num_base_bdevs": 4, 00:27:25.607 "num_base_bdevs_discovered": 3, 00:27:25.607 "num_base_bdevs_operational": 3, 00:27:25.607 "base_bdevs_list": [ 00:27:25.607 { 00:27:25.607 "name": "spare", 00:27:25.607 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:25.608 "is_configured": true, 00:27:25.608 "data_offset": 2048, 00:27:25.608 "data_size": 63488 00:27:25.608 }, 00:27:25.608 { 00:27:25.608 "name": null, 
00:27:25.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.608 "is_configured": false, 00:27:25.608 "data_offset": 2048, 00:27:25.608 "data_size": 63488 00:27:25.608 }, 00:27:25.608 { 00:27:25.608 "name": "BaseBdev3", 00:27:25.608 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:25.608 "is_configured": true, 00:27:25.608 "data_offset": 2048, 00:27:25.608 "data_size": 63488 00:27:25.608 }, 00:27:25.608 { 00:27:25.608 "name": "BaseBdev4", 00:27:25.608 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:25.608 "is_configured": true, 00:27:25.608 "data_offset": 2048, 00:27:25.608 "data_size": 63488 00:27:25.608 } 00:27:25.608 ] 00:27:25.608 }' 00:27:25.608 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:25.867 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:25.867 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:25.867 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:25.867 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.867 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:26.126 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:27:26.126 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:26.385 [2024-07-24 20:03:17.731859] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 2 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.385 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.644 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.644 "name": "raid_bdev1", 00:27:26.644 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:26.644 "strip_size_kb": 0, 00:27:26.644 "state": "online", 00:27:26.644 "raid_level": "raid1", 00:27:26.644 "superblock": true, 00:27:26.644 "num_base_bdevs": 4, 00:27:26.644 "num_base_bdevs_discovered": 2, 00:27:26.644 "num_base_bdevs_operational": 2, 00:27:26.644 "base_bdevs_list": [ 00:27:26.644 { 00:27:26.644 "name": null, 00:27:26.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.644 "is_configured": false, 00:27:26.644 "data_offset": 
2048, 00:27:26.644 "data_size": 63488 00:27:26.644 }, 00:27:26.644 { 00:27:26.644 "name": null, 00:27:26.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.644 "is_configured": false, 00:27:26.644 "data_offset": 2048, 00:27:26.644 "data_size": 63488 00:27:26.644 }, 00:27:26.644 { 00:27:26.644 "name": "BaseBdev3", 00:27:26.644 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:26.644 "is_configured": true, 00:27:26.644 "data_offset": 2048, 00:27:26.644 "data_size": 63488 00:27:26.644 }, 00:27:26.644 { 00:27:26.644 "name": "BaseBdev4", 00:27:26.644 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:26.644 "is_configured": true, 00:27:26.644 "data_offset": 2048, 00:27:26.644 "data_size": 63488 00:27:26.644 } 00:27:26.644 ] 00:27:26.644 }' 00:27:26.645 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.645 20:03:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:27.252 20:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:27.252 [2024-07-24 20:03:18.826944] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:27.252 [2024-07-24 20:03:18.827095] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:27.252 [2024-07-24 20:03:18.827111] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:27.252 [2024-07-24 20:03:18.827139] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:27.252 [2024-07-24 20:03:18.831568] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa89fc0 00:27:27.252 [2024-07-24 20:03:18.833861] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:27.515 20:03:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:27:28.453 20:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:28.453 20:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:28.453 20:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:28.453 20:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:28.453 20:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:28.453 20:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.453 20:03:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:28.713 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:28.713 "name": "raid_bdev1", 00:27:28.713 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:28.713 "strip_size_kb": 0, 00:27:28.713 "state": "online", 00:27:28.713 "raid_level": "raid1", 00:27:28.713 "superblock": true, 00:27:28.713 "num_base_bdevs": 4, 00:27:28.713 "num_base_bdevs_discovered": 3, 00:27:28.713 "num_base_bdevs_operational": 3, 00:27:28.713 "process": { 00:27:28.713 "type": "rebuild", 00:27:28.713 "target": "spare", 00:27:28.713 "progress": { 00:27:28.713 "blocks": 22528, 
00:27:28.713 "percent": 35 00:27:28.713 } 00:27:28.713 }, 00:27:28.713 "base_bdevs_list": [ 00:27:28.713 { 00:27:28.713 "name": "spare", 00:27:28.713 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:28.713 "is_configured": true, 00:27:28.713 "data_offset": 2048, 00:27:28.713 "data_size": 63488 00:27:28.713 }, 00:27:28.713 { 00:27:28.713 "name": null, 00:27:28.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.713 "is_configured": false, 00:27:28.713 "data_offset": 2048, 00:27:28.713 "data_size": 63488 00:27:28.713 }, 00:27:28.713 { 00:27:28.713 "name": "BaseBdev3", 00:27:28.713 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:28.713 "is_configured": true, 00:27:28.713 "data_offset": 2048, 00:27:28.713 "data_size": 63488 00:27:28.713 }, 00:27:28.713 { 00:27:28.713 "name": "BaseBdev4", 00:27:28.713 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:28.713 "is_configured": true, 00:27:28.713 "data_offset": 2048, 00:27:28.713 "data_size": 63488 00:27:28.713 } 00:27:28.713 ] 00:27:28.713 }' 00:27:28.713 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:28.713 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:28.713 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:28.713 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:28.713 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:28.974 [2024-07-24 20:03:20.396944] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:28.974 [2024-07-24 20:03:20.446362] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:28.974 [2024-07-24 20:03:20.446414] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:28.974 [2024-07-24 20:03:20.446431] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:28.974 [2024-07-24 20:03:20.446440] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.974 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.234 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.234 "name": "raid_bdev1", 00:27:29.234 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 
00:27:29.234 "strip_size_kb": 0, 00:27:29.234 "state": "online", 00:27:29.234 "raid_level": "raid1", 00:27:29.234 "superblock": true, 00:27:29.234 "num_base_bdevs": 4, 00:27:29.234 "num_base_bdevs_discovered": 2, 00:27:29.234 "num_base_bdevs_operational": 2, 00:27:29.234 "base_bdevs_list": [ 00:27:29.234 { 00:27:29.234 "name": null, 00:27:29.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.234 "is_configured": false, 00:27:29.234 "data_offset": 2048, 00:27:29.234 "data_size": 63488 00:27:29.234 }, 00:27:29.234 { 00:27:29.234 "name": null, 00:27:29.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.234 "is_configured": false, 00:27:29.234 "data_offset": 2048, 00:27:29.234 "data_size": 63488 00:27:29.234 }, 00:27:29.234 { 00:27:29.234 "name": "BaseBdev3", 00:27:29.234 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:29.234 "is_configured": true, 00:27:29.234 "data_offset": 2048, 00:27:29.234 "data_size": 63488 00:27:29.234 }, 00:27:29.234 { 00:27:29.234 "name": "BaseBdev4", 00:27:29.234 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:29.234 "is_configured": true, 00:27:29.234 "data_offset": 2048, 00:27:29.234 "data_size": 63488 00:27:29.234 } 00:27:29.234 ] 00:27:29.234 }' 00:27:29.234 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:29.234 20:03:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:29.802 20:03:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:30.062 [2024-07-24 20:03:21.549599] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:30.062 [2024-07-24 20:03:21.549651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:30.062 [2024-07-24 20:03:21.549673] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0xa89a60 00:27:30.062 [2024-07-24 20:03:21.549686] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:30.062 [2024-07-24 20:03:21.550063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:30.062 [2024-07-24 20:03:21.550083] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:30.062 [2024-07-24 20:03:21.550165] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:30.062 [2024-07-24 20:03:21.550179] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:30.062 [2024-07-24 20:03:21.550189] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:30.062 [2024-07-24 20:03:21.550209] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:30.062 [2024-07-24 20:03:21.554674] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa8dc60 00:27:30.062 spare 00:27:30.062 [2024-07-24 20:03:21.556150] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:30.062 20:03:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:27:30.999 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:30.999 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:30.999 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:30.999 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:30.999 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:30.999 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.000 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.259 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.259 "name": "raid_bdev1", 00:27:31.259 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:31.259 "strip_size_kb": 0, 00:27:31.259 "state": "online", 00:27:31.259 "raid_level": "raid1", 00:27:31.259 "superblock": true, 00:27:31.259 "num_base_bdevs": 4, 00:27:31.259 "num_base_bdevs_discovered": 3, 00:27:31.259 "num_base_bdevs_operational": 3, 00:27:31.259 "process": { 00:27:31.259 "type": "rebuild", 00:27:31.259 "target": "spare", 00:27:31.259 "progress": { 00:27:31.259 "blocks": 24576, 00:27:31.259 "percent": 38 00:27:31.259 } 00:27:31.259 }, 00:27:31.259 "base_bdevs_list": [ 00:27:31.259 { 00:27:31.259 "name": "spare", 00:27:31.259 "uuid": "33203e59-8ae6-5b52-aff7-1d34780c471d", 00:27:31.259 "is_configured": true, 00:27:31.259 "data_offset": 2048, 00:27:31.259 "data_size": 63488 00:27:31.259 }, 00:27:31.259 { 00:27:31.259 "name": null, 00:27:31.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.259 "is_configured": false, 00:27:31.259 "data_offset": 2048, 00:27:31.259 "data_size": 63488 00:27:31.259 }, 00:27:31.259 { 00:27:31.259 "name": "BaseBdev3", 00:27:31.259 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:31.259 "is_configured": true, 00:27:31.259 "data_offset": 2048, 00:27:31.259 "data_size": 63488 00:27:31.259 }, 00:27:31.259 { 00:27:31.259 "name": "BaseBdev4", 00:27:31.259 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:31.259 "is_configured": true, 00:27:31.259 "data_offset": 2048, 00:27:31.259 "data_size": 63488 00:27:31.259 } 00:27:31.259 ] 00:27:31.259 }' 00:27:31.259 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:27:31.519 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:31.519 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:31.519 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:31.519 20:03:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:31.778 [2024-07-24 20:03:23.153042] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:31.778 [2024-07-24 20:03:23.169011] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:31.778 [2024-07-24 20:03:23.169057] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:31.778 [2024-07-24 20:03:23.169075] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:31.778 [2024-07-24 20:03:23.169083] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:31.778 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:31.778 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:31.778 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:31.778 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:31.778 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:31.778 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:31.778 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:27:31.779 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:31.779 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:31.779 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:31.779 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.779 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.037 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.037 "name": "raid_bdev1", 00:27:32.037 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:32.037 "strip_size_kb": 0, 00:27:32.037 "state": "online", 00:27:32.037 "raid_level": "raid1", 00:27:32.037 "superblock": true, 00:27:32.037 "num_base_bdevs": 4, 00:27:32.037 "num_base_bdevs_discovered": 2, 00:27:32.037 "num_base_bdevs_operational": 2, 00:27:32.037 "base_bdevs_list": [ 00:27:32.037 { 00:27:32.037 "name": null, 00:27:32.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.037 "is_configured": false, 00:27:32.037 "data_offset": 2048, 00:27:32.037 "data_size": 63488 00:27:32.037 }, 00:27:32.037 { 00:27:32.037 "name": null, 00:27:32.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.037 "is_configured": false, 00:27:32.037 "data_offset": 2048, 00:27:32.037 "data_size": 63488 00:27:32.037 }, 00:27:32.037 { 00:27:32.037 "name": "BaseBdev3", 00:27:32.037 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:32.037 "is_configured": true, 00:27:32.037 "data_offset": 2048, 00:27:32.037 "data_size": 63488 00:27:32.037 }, 00:27:32.037 { 00:27:32.037 "name": "BaseBdev4", 00:27:32.037 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:32.037 "is_configured": true, 00:27:32.037 "data_offset": 2048, 
00:27:32.037 "data_size": 63488 00:27:32.037 } 00:27:32.037 ] 00:27:32.037 }' 00:27:32.038 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.038 20:03:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:32.606 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:32.606 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:32.606 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:32.606 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:32.606 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:32.606 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.606 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.866 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:32.866 "name": "raid_bdev1", 00:27:32.866 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:32.866 "strip_size_kb": 0, 00:27:32.866 "state": "online", 00:27:32.866 "raid_level": "raid1", 00:27:32.866 "superblock": true, 00:27:32.866 "num_base_bdevs": 4, 00:27:32.866 "num_base_bdevs_discovered": 2, 00:27:32.866 "num_base_bdevs_operational": 2, 00:27:32.866 "base_bdevs_list": [ 00:27:32.866 { 00:27:32.866 "name": null, 00:27:32.866 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.866 "is_configured": false, 00:27:32.866 "data_offset": 2048, 00:27:32.866 "data_size": 63488 00:27:32.866 }, 00:27:32.866 { 00:27:32.866 "name": null, 00:27:32.866 "uuid": "00000000-0000-0000-0000-000000000000", 
00:27:32.866 "is_configured": false, 00:27:32.866 "data_offset": 2048, 00:27:32.866 "data_size": 63488 00:27:32.866 }, 00:27:32.866 { 00:27:32.866 "name": "BaseBdev3", 00:27:32.866 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:32.866 "is_configured": true, 00:27:32.866 "data_offset": 2048, 00:27:32.866 "data_size": 63488 00:27:32.866 }, 00:27:32.866 { 00:27:32.866 "name": "BaseBdev4", 00:27:32.866 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:32.866 "is_configured": true, 00:27:32.866 "data_offset": 2048, 00:27:32.866 "data_size": 63488 00:27:32.866 } 00:27:32.866 ] 00:27:32.866 }' 00:27:32.866 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:32.866 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:32.866 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:32.866 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:32.866 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:33.126 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:33.385 [2024-07-24 20:03:24.858632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:33.385 [2024-07-24 20:03:24.858680] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:33.385 [2024-07-24 20:03:24.858700] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa91f00 00:27:33.385 [2024-07-24 20:03:24.858713] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:33.385 
[2024-07-24 20:03:24.859060] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:33.385 [2024-07-24 20:03:24.859079] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:33.385 [2024-07-24 20:03:24.859147] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:33.385 [2024-07-24 20:03:24.859161] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:33.385 [2024-07-24 20:03:24.859173] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:33.385 BaseBdev1 00:27:33.385 20:03:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.323 20:03:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:34.582 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.582 "name": "raid_bdev1", 00:27:34.582 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:34.582 "strip_size_kb": 0, 00:27:34.582 "state": "online", 00:27:34.582 "raid_level": "raid1", 00:27:34.582 "superblock": true, 00:27:34.582 "num_base_bdevs": 4, 00:27:34.582 "num_base_bdevs_discovered": 2, 00:27:34.582 "num_base_bdevs_operational": 2, 00:27:34.582 "base_bdevs_list": [ 00:27:34.582 { 00:27:34.582 "name": null, 00:27:34.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.582 "is_configured": false, 00:27:34.582 "data_offset": 2048, 00:27:34.582 "data_size": 63488 00:27:34.582 }, 00:27:34.582 { 00:27:34.582 "name": null, 00:27:34.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.582 "is_configured": false, 00:27:34.582 "data_offset": 2048, 00:27:34.582 "data_size": 63488 00:27:34.582 }, 00:27:34.582 { 00:27:34.582 "name": "BaseBdev3", 00:27:34.582 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:34.582 "is_configured": true, 00:27:34.582 "data_offset": 2048, 00:27:34.582 "data_size": 63488 00:27:34.582 }, 00:27:34.582 { 00:27:34.582 "name": "BaseBdev4", 00:27:34.582 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:34.582 "is_configured": true, 00:27:34.582 "data_offset": 2048, 00:27:34.582 "data_size": 63488 00:27:34.582 } 00:27:34.582 ] 00:27:34.582 }' 00:27:34.582 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.582 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:35.151 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:27:35.151 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:35.151 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:35.151 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:35.151 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:35.410 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:35.410 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:35.410 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:35.410 "name": "raid_bdev1", 00:27:35.410 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:35.410 "strip_size_kb": 0, 00:27:35.410 "state": "online", 00:27:35.410 "raid_level": "raid1", 00:27:35.410 "superblock": true, 00:27:35.410 "num_base_bdevs": 4, 00:27:35.410 "num_base_bdevs_discovered": 2, 00:27:35.410 "num_base_bdevs_operational": 2, 00:27:35.410 "base_bdevs_list": [ 00:27:35.410 { 00:27:35.410 "name": null, 00:27:35.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:35.410 "is_configured": false, 00:27:35.410 "data_offset": 2048, 00:27:35.410 "data_size": 63488 00:27:35.410 }, 00:27:35.410 { 00:27:35.410 "name": null, 00:27:35.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:35.410 "is_configured": false, 00:27:35.410 "data_offset": 2048, 00:27:35.410 "data_size": 63488 00:27:35.410 }, 00:27:35.410 { 00:27:35.410 "name": "BaseBdev3", 00:27:35.410 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:35.410 "is_configured": true, 00:27:35.410 "data_offset": 2048, 00:27:35.410 "data_size": 63488 00:27:35.410 }, 00:27:35.410 { 
00:27:35.410 "name": "BaseBdev4", 00:27:35.410 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:35.410 "is_configured": true, 00:27:35.410 "data_offset": 2048, 00:27:35.410 "data_size": 63488 00:27:35.410 } 00:27:35.410 ] 00:27:35.410 }' 00:27:35.410 20:03:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local es=0 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:35.670 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:35.929 [2024-07-24 20:03:27.321499] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:35.929 [2024-07-24 20:03:27.321622] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:35.929 [2024-07-24 20:03:27.321638] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:35.929 request: 00:27:35.929 { 00:27:35.929 "base_bdev": "BaseBdev1", 00:27:35.929 "raid_bdev": "raid_bdev1", 00:27:35.929 "method": "bdev_raid_add_base_bdev", 00:27:35.929 "req_id": 1 00:27:35.929 } 00:27:35.929 Got JSON-RPC error response 00:27:35.929 response: 00:27:35.929 { 00:27:35.929 "code": -22, 00:27:35.929 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:35.929 } 00:27:35.929 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:27:35.929 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:35.929 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:35.929 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:35.929 20:03:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.866 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.126 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:37.126 "name": "raid_bdev1", 00:27:37.126 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:37.126 "strip_size_kb": 0, 00:27:37.126 "state": "online", 00:27:37.126 "raid_level": "raid1", 00:27:37.126 "superblock": true, 00:27:37.126 "num_base_bdevs": 4, 00:27:37.126 
"num_base_bdevs_discovered": 2, 00:27:37.126 "num_base_bdevs_operational": 2, 00:27:37.126 "base_bdevs_list": [ 00:27:37.126 { 00:27:37.126 "name": null, 00:27:37.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.126 "is_configured": false, 00:27:37.126 "data_offset": 2048, 00:27:37.126 "data_size": 63488 00:27:37.126 }, 00:27:37.126 { 00:27:37.126 "name": null, 00:27:37.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.126 "is_configured": false, 00:27:37.126 "data_offset": 2048, 00:27:37.126 "data_size": 63488 00:27:37.126 }, 00:27:37.126 { 00:27:37.126 "name": "BaseBdev3", 00:27:37.126 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:37.126 "is_configured": true, 00:27:37.126 "data_offset": 2048, 00:27:37.126 "data_size": 63488 00:27:37.126 }, 00:27:37.126 { 00:27:37.126 "name": "BaseBdev4", 00:27:37.126 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:37.126 "is_configured": true, 00:27:37.126 "data_offset": 2048, 00:27:37.126 "data_size": 63488 00:27:37.126 } 00:27:37.126 ] 00:27:37.126 }' 00:27:37.126 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.126 20:03:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:37.692 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:37.692 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:37.692 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:37.692 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:37.692 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:37.692 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:37.692 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.949 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:37.949 "name": "raid_bdev1", 00:27:37.949 "uuid": "1d9ecef4-840c-4598-b75f-b251fa289d33", 00:27:37.949 "strip_size_kb": 0, 00:27:37.949 "state": "online", 00:27:37.949 "raid_level": "raid1", 00:27:37.949 "superblock": true, 00:27:37.949 "num_base_bdevs": 4, 00:27:37.949 "num_base_bdevs_discovered": 2, 00:27:37.949 "num_base_bdevs_operational": 2, 00:27:37.949 "base_bdevs_list": [ 00:27:37.949 { 00:27:37.949 "name": null, 00:27:37.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.949 "is_configured": false, 00:27:37.949 "data_offset": 2048, 00:27:37.949 "data_size": 63488 00:27:37.949 }, 00:27:37.949 { 00:27:37.949 "name": null, 00:27:37.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.950 "is_configured": false, 00:27:37.950 "data_offset": 2048, 00:27:37.950 "data_size": 63488 00:27:37.950 }, 00:27:37.950 { 00:27:37.950 "name": "BaseBdev3", 00:27:37.950 "uuid": "b415badc-c7ea-5511-bea2-d716adb30c30", 00:27:37.950 "is_configured": true, 00:27:37.950 "data_offset": 2048, 00:27:37.950 "data_size": 63488 00:27:37.950 }, 00:27:37.950 { 00:27:37.950 "name": "BaseBdev4", 00:27:37.950 "uuid": "334c2e1a-fcd2-55a9-b7c7-3d9d2771f4b4", 00:27:37.950 "is_configured": true, 00:27:37.950 "data_offset": 2048, 00:27:37.950 "data_size": 63488 00:27:37.950 } 00:27:37.950 ] 00:27:37.950 }' 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1511931 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1511931 ']' 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1511931 00:27:37.950 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:27:38.208 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:38.208 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1511931 00:27:38.208 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:38.208 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:38.208 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1511931' 00:27:38.208 killing process with pid 1511931 00:27:38.208 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1511931 00:27:38.208 Received shutdown signal, test time was about 30.009224 seconds 00:27:38.208 00:27:38.208 Latency(us) 00:27:38.208 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:38.208 =================================================================================================================== 00:27:38.208 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:38.208 [2024-07-24 20:03:29.602777] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:38.208 [2024-07-24 20:03:29.602876] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:38.208 [2024-07-24 20:03:29.602940] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:38.208 20:03:29
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1511931 00:27:38.208 [2024-07-24 20:03:29.602957] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb299a0 name raid_bdev1, state offline 00:27:38.208 [2024-07-24 20:03:29.644167] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:38.467 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:27:38.467 00:27:38.467 real 0m36.437s 00:27:38.467 user 0m58.573s 00:27:38.467 sys 0m5.641s 00:27:38.467 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:38.467 20:03:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:38.467 ************************************ 00:27:38.467 END TEST raid_rebuild_test_sb_io 00:27:38.467 ************************************ 00:27:38.467 20:03:29 bdev_raid -- bdev/bdev_raid.sh@964 -- # '[' n == y ']' 00:27:38.467 20:03:29 bdev_raid -- bdev/bdev_raid.sh@976 -- # base_blocklen=4096 00:27:38.467 20:03:29 bdev_raid -- bdev/bdev_raid.sh@978 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:27:38.467 20:03:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:27:38.467 20:03:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:38.467 20:03:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:38.467 ************************************ 00:27:38.467 START TEST raid_state_function_test_sb_4k 00:27:38.467 ************************************ 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:38.467
20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1517643 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1517643' 00:27:38.467 Process raid pid: 1517643 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1517643 /var/tmp/spdk-raid.sock 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1517643 ']' 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:38.467 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:38.467 20:03:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:38.467 [2024-07-24 20:03:30.008363] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:27:38.467 [2024-07-24 20:03:30.008434] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:38.726 [2024-07-24 20:03:30.136888] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:38.726 [2024-07-24 20:03:30.238770] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:38.726 [2024-07-24 20:03:30.292632] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:38.726 [2024-07-24 20:03:30.292660] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:39.663 20:03:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:39.663 20:03:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:27:39.663 20:03:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:39.663 [2024-07-24 20:03:31.215997] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:39.663 [2024-07-24 20:03:31.216042] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:39.663 [2024-07-24 20:03:31.216054] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:39.663 [2024-07-24 20:03:31.216066] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.663 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:40.231 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:40.231 "name": "Existed_Raid", 00:27:40.231 "uuid": "445b4364-36d1-4012-9557-6aa3ad4ccf05", 00:27:40.231 "strip_size_kb": 0, 00:27:40.231 "state": "configuring", 00:27:40.231 "raid_level": "raid1", 00:27:40.231 "superblock": true, 00:27:40.231 "num_base_bdevs": 2, 00:27:40.231 "num_base_bdevs_discovered": 0, 00:27:40.231 "num_base_bdevs_operational": 2, 00:27:40.231 "base_bdevs_list": [ 00:27:40.231 { 00:27:40.231 "name": "BaseBdev1", 00:27:40.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:40.231 "is_configured": false, 00:27:40.231 "data_offset": 0, 00:27:40.231 "data_size": 0 
00:27:40.231 }, 00:27:40.231 { 00:27:40.231 "name": "BaseBdev2", 00:27:40.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:40.231 "is_configured": false, 00:27:40.231 "data_offset": 0, 00:27:40.231 "data_size": 0 00:27:40.231 } 00:27:40.231 ] 00:27:40.231 }' 00:27:40.231 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:40.231 20:03:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:41.166 20:03:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:41.166 [2024-07-24 20:03:32.627565] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:41.166 [2024-07-24 20:03:32.627597] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c59f0 name Existed_Raid, state configuring 00:27:41.166 20:03:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:41.734 [2024-07-24 20:03:33.128908] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:41.734 [2024-07-24 20:03:33.128945] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:41.734 [2024-07-24 20:03:33.128955] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:41.734 [2024-07-24 20:03:33.128967] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:41.734 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:27:42.300 [2024-07-24 
20:03:33.652137] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:42.300 BaseBdev1 00:27:42.300 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:42.300 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:27:42.300 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:27:42.300 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:27:42.300 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:27:42.300 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:27:42.300 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:42.559 20:03:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:42.818 [ 00:27:42.818 { 00:27:42.818 "name": "BaseBdev1", 00:27:42.818 "aliases": [ 00:27:42.818 "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b" 00:27:42.818 ], 00:27:42.818 "product_name": "Malloc disk", 00:27:42.818 "block_size": 4096, 00:27:42.818 "num_blocks": 8192, 00:27:42.818 "uuid": "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b", 00:27:42.818 "assigned_rate_limits": { 00:27:42.818 "rw_ios_per_sec": 0, 00:27:42.818 "rw_mbytes_per_sec": 0, 00:27:42.818 "r_mbytes_per_sec": 0, 00:27:42.818 "w_mbytes_per_sec": 0 00:27:42.818 }, 00:27:42.818 "claimed": true, 00:27:42.818 "claim_type": "exclusive_write", 00:27:42.818 "zoned": false, 00:27:42.818 "supported_io_types": { 00:27:42.818 "read": true, 00:27:42.818 "write": true, 
00:27:42.818 "unmap": true, 00:27:42.818 "flush": true, 00:27:42.818 "reset": true, 00:27:42.818 "nvme_admin": false, 00:27:42.818 "nvme_io": false, 00:27:42.818 "nvme_io_md": false, 00:27:42.818 "write_zeroes": true, 00:27:42.818 "zcopy": true, 00:27:42.818 "get_zone_info": false, 00:27:42.818 "zone_management": false, 00:27:42.818 "zone_append": false, 00:27:42.818 "compare": false, 00:27:42.818 "compare_and_write": false, 00:27:42.818 "abort": true, 00:27:42.818 "seek_hole": false, 00:27:42.818 "seek_data": false, 00:27:42.818 "copy": true, 00:27:42.818 "nvme_iov_md": false 00:27:42.818 }, 00:27:42.818 "memory_domains": [ 00:27:42.818 { 00:27:42.818 "dma_device_id": "system", 00:27:42.818 "dma_device_type": 1 00:27:42.818 }, 00:27:42.818 { 00:27:42.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:42.818 "dma_device_type": 2 00:27:42.818 } 00:27:42.818 ], 00:27:42.818 "driver_specific": {} 00:27:42.818 } 00:27:42.818 ] 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@907 -- # return 0 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.818 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:43.077 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.077 "name": "Existed_Raid", 00:27:43.077 "uuid": "7205c14a-1baf-4901-87d5-9d4c0f8b3cee", 00:27:43.077 "strip_size_kb": 0, 00:27:43.077 "state": "configuring", 00:27:43.077 "raid_level": "raid1", 00:27:43.077 "superblock": true, 00:27:43.077 "num_base_bdevs": 2, 00:27:43.077 "num_base_bdevs_discovered": 1, 00:27:43.077 "num_base_bdevs_operational": 2, 00:27:43.077 "base_bdevs_list": [ 00:27:43.077 { 00:27:43.077 "name": "BaseBdev1", 00:27:43.077 "uuid": "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b", 00:27:43.077 "is_configured": true, 00:27:43.077 "data_offset": 256, 00:27:43.077 "data_size": 7936 00:27:43.077 }, 00:27:43.077 { 00:27:43.077 "name": "BaseBdev2", 00:27:43.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.077 "is_configured": false, 00:27:43.077 "data_offset": 0, 00:27:43.077 "data_size": 0 00:27:43.077 } 00:27:43.077 ] 00:27:43.078 }' 00:27:43.078 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.078 20:03:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:43.646 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:43.905 [2024-07-24 20:03:35.256385] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:43.905 [2024-07-24 20:03:35.256429] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c52e0 name Existed_Raid, state configuring 00:27:43.905 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:44.164 [2024-07-24 20:03:35.509084] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:44.164 [2024-07-24 20:03:35.510554] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:44.164 [2024-07-24 20:03:35.510585] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:44.164 20:03:35 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.164 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:44.467 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:44.467 "name": "Existed_Raid", 00:27:44.467 "uuid": "6526caff-1da0-4289-8480-8ca508648e1a", 00:27:44.467 "strip_size_kb": 0, 00:27:44.467 "state": "configuring", 00:27:44.467 "raid_level": "raid1", 00:27:44.467 "superblock": true, 00:27:44.467 "num_base_bdevs": 2, 00:27:44.467 "num_base_bdevs_discovered": 1, 00:27:44.467 "num_base_bdevs_operational": 2, 00:27:44.467 "base_bdevs_list": [ 00:27:44.467 { 00:27:44.467 "name": "BaseBdev1", 00:27:44.467 "uuid": "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b", 00:27:44.467 "is_configured": true, 00:27:44.467 "data_offset": 256, 00:27:44.467 "data_size": 7936 00:27:44.467 }, 00:27:44.467 { 00:27:44.467 "name": "BaseBdev2", 00:27:44.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:44.467 "is_configured": false, 00:27:44.467 "data_offset": 0, 00:27:44.467 "data_size": 0 00:27:44.467 } 00:27:44.467 ] 00:27:44.467 }' 00:27:44.468 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:44.468 20:03:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:45.060 
20:03:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:27:45.060 [2024-07-24 20:03:36.595406] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:45.060 [2024-07-24 20:03:36.595556] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c60d0 00:27:45.060 [2024-07-24 20:03:36.595570] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:45.060 [2024-07-24 20:03:36.595744] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2879bd0 00:27:45.060 [2024-07-24 20:03:36.595872] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c60d0 00:27:45.060 [2024-07-24 20:03:36.595883] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x26c60d0 00:27:45.060 [2024-07-24 20:03:36.595977] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:45.060 BaseBdev2 00:27:45.060 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:45.060 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:27:45.060 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:27:45.060 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:27:45.060 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:27:45.060 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:27:45.060 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:45.320 20:03:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:45.580 [ 00:27:45.580 { 00:27:45.580 "name": "BaseBdev2", 00:27:45.580 "aliases": [ 00:27:45.580 "a2db087a-966e-4865-a3f1-a00d798ae1c7" 00:27:45.580 ], 00:27:45.580 "product_name": "Malloc disk", 00:27:45.580 "block_size": 4096, 00:27:45.580 "num_blocks": 8192, 00:27:45.580 "uuid": "a2db087a-966e-4865-a3f1-a00d798ae1c7", 00:27:45.580 "assigned_rate_limits": { 00:27:45.580 "rw_ios_per_sec": 0, 00:27:45.580 "rw_mbytes_per_sec": 0, 00:27:45.580 "r_mbytes_per_sec": 0, 00:27:45.580 "w_mbytes_per_sec": 0 00:27:45.580 }, 00:27:45.580 "claimed": true, 00:27:45.580 "claim_type": "exclusive_write", 00:27:45.580 "zoned": false, 00:27:45.580 "supported_io_types": { 00:27:45.580 "read": true, 00:27:45.580 "write": true, 00:27:45.580 "unmap": true, 00:27:45.580 "flush": true, 00:27:45.580 "reset": true, 00:27:45.580 "nvme_admin": false, 00:27:45.580 "nvme_io": false, 00:27:45.580 "nvme_io_md": false, 00:27:45.580 "write_zeroes": true, 00:27:45.580 "zcopy": true, 00:27:45.580 "get_zone_info": false, 00:27:45.580 "zone_management": false, 00:27:45.580 "zone_append": false, 00:27:45.580 "compare": false, 00:27:45.580 "compare_and_write": false, 00:27:45.580 "abort": true, 00:27:45.580 "seek_hole": false, 00:27:45.580 "seek_data": false, 00:27:45.580 "copy": true, 00:27:45.580 "nvme_iov_md": false 00:27:45.580 }, 00:27:45.580 "memory_domains": [ 00:27:45.580 { 00:27:45.580 "dma_device_id": "system", 00:27:45.580 "dma_device_type": 1 00:27:45.580 }, 00:27:45.580 { 00:27:45.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:45.580 "dma_device_type": 2 00:27:45.580 } 00:27:45.580 ], 00:27:45.580 "driver_specific": {} 00:27:45.580 } 00:27:45.580 ] 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@907 -- # return 0 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.580 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:45.840 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.840 "name": "Existed_Raid", 00:27:45.840 "uuid": 
"6526caff-1da0-4289-8480-8ca508648e1a", 00:27:45.840 "strip_size_kb": 0, 00:27:45.840 "state": "online", 00:27:45.840 "raid_level": "raid1", 00:27:45.840 "superblock": true, 00:27:45.840 "num_base_bdevs": 2, 00:27:45.840 "num_base_bdevs_discovered": 2, 00:27:45.840 "num_base_bdevs_operational": 2, 00:27:45.840 "base_bdevs_list": [ 00:27:45.840 { 00:27:45.840 "name": "BaseBdev1", 00:27:45.840 "uuid": "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b", 00:27:45.840 "is_configured": true, 00:27:45.840 "data_offset": 256, 00:27:45.840 "data_size": 7936 00:27:45.840 }, 00:27:45.840 { 00:27:45.840 "name": "BaseBdev2", 00:27:45.840 "uuid": "a2db087a-966e-4865-a3f1-a00d798ae1c7", 00:27:45.840 "is_configured": true, 00:27:45.840 "data_offset": 256, 00:27:45.840 "data_size": 7936 00:27:45.840 } 00:27:45.840 ] 00:27:45.840 }' 00:27:45.840 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.840 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:46.407 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:46.407 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:46.407 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:46.407 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:46.407 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:46.407 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:46.407 20:03:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:46.407 20:03:37 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:46.667 [2024-07-24 20:03:38.211959] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:46.667 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:46.667 "name": "Existed_Raid", 00:27:46.667 "aliases": [ 00:27:46.667 "6526caff-1da0-4289-8480-8ca508648e1a" 00:27:46.667 ], 00:27:46.667 "product_name": "Raid Volume", 00:27:46.667 "block_size": 4096, 00:27:46.667 "num_blocks": 7936, 00:27:46.667 "uuid": "6526caff-1da0-4289-8480-8ca508648e1a", 00:27:46.667 "assigned_rate_limits": { 00:27:46.667 "rw_ios_per_sec": 0, 00:27:46.667 "rw_mbytes_per_sec": 0, 00:27:46.667 "r_mbytes_per_sec": 0, 00:27:46.667 "w_mbytes_per_sec": 0 00:27:46.667 }, 00:27:46.667 "claimed": false, 00:27:46.667 "zoned": false, 00:27:46.667 "supported_io_types": { 00:27:46.667 "read": true, 00:27:46.667 "write": true, 00:27:46.667 "unmap": false, 00:27:46.667 "flush": false, 00:27:46.667 "reset": true, 00:27:46.667 "nvme_admin": false, 00:27:46.667 "nvme_io": false, 00:27:46.667 "nvme_io_md": false, 00:27:46.667 "write_zeroes": true, 00:27:46.667 "zcopy": false, 00:27:46.667 "get_zone_info": false, 00:27:46.667 "zone_management": false, 00:27:46.667 "zone_append": false, 00:27:46.667 "compare": false, 00:27:46.667 "compare_and_write": false, 00:27:46.667 "abort": false, 00:27:46.667 "seek_hole": false, 00:27:46.667 "seek_data": false, 00:27:46.667 "copy": false, 00:27:46.667 "nvme_iov_md": false 00:27:46.667 }, 00:27:46.667 "memory_domains": [ 00:27:46.667 { 00:27:46.667 "dma_device_id": "system", 00:27:46.667 "dma_device_type": 1 00:27:46.667 }, 00:27:46.667 { 00:27:46.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:46.667 "dma_device_type": 2 00:27:46.667 }, 00:27:46.667 { 00:27:46.667 "dma_device_id": "system", 00:27:46.667 "dma_device_type": 1 00:27:46.667 }, 00:27:46.667 { 00:27:46.667 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:27:46.667 "dma_device_type": 2 00:27:46.667 } 00:27:46.667 ], 00:27:46.667 "driver_specific": { 00:27:46.667 "raid": { 00:27:46.667 "uuid": "6526caff-1da0-4289-8480-8ca508648e1a", 00:27:46.667 "strip_size_kb": 0, 00:27:46.667 "state": "online", 00:27:46.667 "raid_level": "raid1", 00:27:46.667 "superblock": true, 00:27:46.667 "num_base_bdevs": 2, 00:27:46.667 "num_base_bdevs_discovered": 2, 00:27:46.667 "num_base_bdevs_operational": 2, 00:27:46.667 "base_bdevs_list": [ 00:27:46.667 { 00:27:46.667 "name": "BaseBdev1", 00:27:46.667 "uuid": "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b", 00:27:46.667 "is_configured": true, 00:27:46.667 "data_offset": 256, 00:27:46.667 "data_size": 7936 00:27:46.667 }, 00:27:46.667 { 00:27:46.667 "name": "BaseBdev2", 00:27:46.667 "uuid": "a2db087a-966e-4865-a3f1-a00d798ae1c7", 00:27:46.667 "is_configured": true, 00:27:46.667 "data_offset": 256, 00:27:46.667 "data_size": 7936 00:27:46.667 } 00:27:46.667 ] 00:27:46.667 } 00:27:46.667 } 00:27:46.667 }' 00:27:46.667 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:46.926 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:46.926 BaseBdev2' 00:27:46.926 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:46.926 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:46.926 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:47.185 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:47.186 "name": "BaseBdev1", 00:27:47.186 "aliases": [ 00:27:47.186 "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b" 
00:27:47.186 ], 00:27:47.186 "product_name": "Malloc disk", 00:27:47.186 "block_size": 4096, 00:27:47.186 "num_blocks": 8192, 00:27:47.186 "uuid": "ef13ca8f-bdf8-49ba-9a2b-543f5fd3988b", 00:27:47.186 "assigned_rate_limits": { 00:27:47.186 "rw_ios_per_sec": 0, 00:27:47.186 "rw_mbytes_per_sec": 0, 00:27:47.186 "r_mbytes_per_sec": 0, 00:27:47.186 "w_mbytes_per_sec": 0 00:27:47.186 }, 00:27:47.186 "claimed": true, 00:27:47.186 "claim_type": "exclusive_write", 00:27:47.186 "zoned": false, 00:27:47.186 "supported_io_types": { 00:27:47.186 "read": true, 00:27:47.186 "write": true, 00:27:47.186 "unmap": true, 00:27:47.186 "flush": true, 00:27:47.186 "reset": true, 00:27:47.186 "nvme_admin": false, 00:27:47.186 "nvme_io": false, 00:27:47.186 "nvme_io_md": false, 00:27:47.186 "write_zeroes": true, 00:27:47.186 "zcopy": true, 00:27:47.186 "get_zone_info": false, 00:27:47.186 "zone_management": false, 00:27:47.186 "zone_append": false, 00:27:47.186 "compare": false, 00:27:47.186 "compare_and_write": false, 00:27:47.186 "abort": true, 00:27:47.186 "seek_hole": false, 00:27:47.186 "seek_data": false, 00:27:47.186 "copy": true, 00:27:47.186 "nvme_iov_md": false 00:27:47.186 }, 00:27:47.186 "memory_domains": [ 00:27:47.186 { 00:27:47.186 "dma_device_id": "system", 00:27:47.186 "dma_device_type": 1 00:27:47.186 }, 00:27:47.186 { 00:27:47.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.186 "dma_device_type": 2 00:27:47.186 } 00:27:47.186 ], 00:27:47.186 "driver_specific": {} 00:27:47.186 }' 00:27:47.186 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:47.186 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:47.186 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:47.186 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:47.186 20:03:38 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:47.186 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:47.186 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.186 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.445 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:47.445 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:47.445 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:47.445 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:47.445 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:47.445 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:47.445 20:03:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:47.704 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:47.704 "name": "BaseBdev2", 00:27:47.704 "aliases": [ 00:27:47.704 "a2db087a-966e-4865-a3f1-a00d798ae1c7" 00:27:47.704 ], 00:27:47.704 "product_name": "Malloc disk", 00:27:47.704 "block_size": 4096, 00:27:47.704 "num_blocks": 8192, 00:27:47.704 "uuid": "a2db087a-966e-4865-a3f1-a00d798ae1c7", 00:27:47.704 "assigned_rate_limits": { 00:27:47.704 "rw_ios_per_sec": 0, 00:27:47.704 "rw_mbytes_per_sec": 0, 00:27:47.704 "r_mbytes_per_sec": 0, 00:27:47.704 "w_mbytes_per_sec": 0 00:27:47.704 }, 00:27:47.704 "claimed": true, 00:27:47.704 "claim_type": "exclusive_write", 00:27:47.704 "zoned": false, 
00:27:47.705 "supported_io_types": { 00:27:47.705 "read": true, 00:27:47.705 "write": true, 00:27:47.705 "unmap": true, 00:27:47.705 "flush": true, 00:27:47.705 "reset": true, 00:27:47.705 "nvme_admin": false, 00:27:47.705 "nvme_io": false, 00:27:47.705 "nvme_io_md": false, 00:27:47.705 "write_zeroes": true, 00:27:47.705 "zcopy": true, 00:27:47.705 "get_zone_info": false, 00:27:47.705 "zone_management": false, 00:27:47.705 "zone_append": false, 00:27:47.705 "compare": false, 00:27:47.705 "compare_and_write": false, 00:27:47.705 "abort": true, 00:27:47.705 "seek_hole": false, 00:27:47.705 "seek_data": false, 00:27:47.705 "copy": true, 00:27:47.705 "nvme_iov_md": false 00:27:47.705 }, 00:27:47.705 "memory_domains": [ 00:27:47.705 { 00:27:47.705 "dma_device_id": "system", 00:27:47.705 "dma_device_type": 1 00:27:47.705 }, 00:27:47.705 { 00:27:47.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.705 "dma_device_type": 2 00:27:47.705 } 00:27:47.705 ], 00:27:47.705 "driver_specific": {} 00:27:47.705 }' 00:27:47.705 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:47.705 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:47.705 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:47.705 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:47.705 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:47.963 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:48.222 [2024-07-24 20:03:39.715731] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:48.222 20:03:39 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.222 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:48.481 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:48.481 "name": "Existed_Raid", 00:27:48.481 "uuid": "6526caff-1da0-4289-8480-8ca508648e1a", 00:27:48.481 "strip_size_kb": 0, 00:27:48.481 "state": "online", 00:27:48.481 "raid_level": "raid1", 00:27:48.481 "superblock": true, 00:27:48.481 "num_base_bdevs": 2, 00:27:48.481 "num_base_bdevs_discovered": 1, 00:27:48.481 "num_base_bdevs_operational": 1, 00:27:48.481 "base_bdevs_list": [ 00:27:48.481 { 00:27:48.481 "name": null, 00:27:48.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.481 "is_configured": false, 00:27:48.481 "data_offset": 256, 00:27:48.481 "data_size": 7936 00:27:48.481 }, 00:27:48.481 { 00:27:48.481 "name": "BaseBdev2", 00:27:48.481 "uuid": "a2db087a-966e-4865-a3f1-a00d798ae1c7", 00:27:48.481 "is_configured": true, 00:27:48.481 "data_offset": 256, 00:27:48.481 "data_size": 7936 00:27:48.481 } 00:27:48.481 ] 00:27:48.481 }' 00:27:48.481 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:48.481 20:03:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:49.049 20:03:40 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:49.050 20:03:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:49.050 20:03:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.050 20:03:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:49.309 20:03:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:49.309 20:03:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:49.309 20:03:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:49.568 [2024-07-24 20:03:41.060688] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:49.568 [2024-07-24 20:03:41.060772] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:49.568 [2024-07-24 20:03:41.071486] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:49.568 [2024-07-24 20:03:41.071519] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:49.568 [2024-07-24 20:03:41.071532] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c60d0 name Existed_Raid, state offline 00:27:49.568 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:49.568 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:49.568 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.568 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1517643 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 1517643 ']' 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1517643 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1517643 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1517643' 00:27:49.828 killing process with pid 1517643 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@969 -- # kill 1517643 00:27:49.828 [2024-07-24 20:03:41.387110] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:49.828 20:03:41 bdev_raid.raid_state_function_test_sb_4k 
-- common/autotest_common.sh@974 -- # wait 1517643 00:27:49.828 [2024-07-24 20:03:41.387974] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:50.087 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:50.087 00:27:50.087 real 0m11.651s 00:27:50.087 user 0m20.821s 00:27:50.087 sys 0m2.163s 00:27:50.087 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:50.087 20:03:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:50.087 ************************************ 00:27:50.087 END TEST raid_state_function_test_sb_4k 00:27:50.087 ************************************ 00:27:50.087 20:03:41 bdev_raid -- bdev/bdev_raid.sh@979 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:50.087 20:03:41 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:27:50.087 20:03:41 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:50.087 20:03:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:50.087 ************************************ 00:27:50.087 START TEST raid_superblock_test_4k 00:27:50.087 ************************************ 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # local 
base_bdevs_pt 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@414 -- # local strip_size 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:27:50.088 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@427 -- # raid_pid=1519273 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@428 -- # waitforlisten 1519273 /var/tmp/spdk-raid.sock 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # '[' -z 1519273 ']' 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:27:50.347 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:50.347 20:03:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:50.347 [2024-07-24 20:03:41.734765] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:27:50.347 [2024-07-24 20:03:41.734833] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1519273 ] 00:27:50.347 [2024-07-24 20:03:41.853413] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:50.606 [2024-07-24 20:03:41.960608] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:50.606 [2024-07-24 20:03:42.035711] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:50.606 [2024-07-24 20:03:42.035754] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@864 -- # return 0 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:51.174 20:03:42 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:51.174 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:51.432 malloc1 00:27:51.432 20:03:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:51.692 [2024-07-24 20:03:43.148280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:51.692 [2024-07-24 20:03:43.148327] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:51.692 [2024-07-24 20:03:43.148347] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed2590 00:27:51.692 [2024-07-24 20:03:43.148360] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:51.692 [2024-07-24 20:03:43.150092] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:51.692 [2024-07-24 20:03:43.150122] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:51.692 pt1 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:27:51.692 
20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:51.692 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:51.960 malloc2 00:27:51.961 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:52.219 [2024-07-24 20:03:43.666452] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:52.219 [2024-07-24 20:03:43.666501] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:52.219 [2024-07-24 20:03:43.666519] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1078690 00:27:52.219 [2024-07-24 20:03:43.666531] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:52.219 [2024-07-24 20:03:43.668134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:52.219 [2024-07-24 20:03:43.668164] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:52.219 pt2 00:27:52.219 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:27:52.219 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:52.219 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@445 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:52.479 [2024-07-24 20:03:43.915122] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:52.479 [2024-07-24 20:03:43.916368] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:52.479 [2024-07-24 20:03:43.916526] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1079980 00:27:52.479 [2024-07-24 20:03:43.916540] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:52.479 [2024-07-24 20:03:43.916728] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x107a730 00:27:52.479 [2024-07-24 20:03:43.916879] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1079980 00:27:52.479 [2024-07-24 20:03:43.916890] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1079980 00:27:52.479 [2024-07-24 20:03:43.916989] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:52.479 20:03:43 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.479 20:03:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.739 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.739 "name": "raid_bdev1", 00:27:52.739 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:27:52.739 "strip_size_kb": 0, 00:27:52.739 "state": "online", 00:27:52.739 "raid_level": "raid1", 00:27:52.739 "superblock": true, 00:27:52.739 "num_base_bdevs": 2, 00:27:52.739 "num_base_bdevs_discovered": 2, 00:27:52.739 "num_base_bdevs_operational": 2, 00:27:52.739 "base_bdevs_list": [ 00:27:52.739 { 00:27:52.739 "name": "pt1", 00:27:52.739 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:52.739 "is_configured": true, 00:27:52.739 "data_offset": 256, 00:27:52.739 "data_size": 7936 00:27:52.739 }, 00:27:52.739 { 00:27:52.739 "name": "pt2", 00:27:52.739 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:52.739 "is_configured": true, 00:27:52.739 "data_offset": 256, 00:27:52.739 "data_size": 7936 00:27:52.739 } 00:27:52.739 ] 00:27:52.739 }' 00:27:52.739 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.739 20:03:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:53.308 20:03:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:53.567 [2024-07-24 20:03:45.026281] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:53.567 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:53.567 "name": "raid_bdev1", 00:27:53.567 "aliases": [ 00:27:53.567 "b4b81a19-604c-459f-b1c6-3c18751fe8cc" 00:27:53.567 ], 00:27:53.567 "product_name": "Raid Volume", 00:27:53.567 "block_size": 4096, 00:27:53.567 "num_blocks": 7936, 00:27:53.567 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:27:53.567 "assigned_rate_limits": { 00:27:53.567 "rw_ios_per_sec": 0, 00:27:53.567 "rw_mbytes_per_sec": 0, 00:27:53.567 "r_mbytes_per_sec": 0, 00:27:53.567 "w_mbytes_per_sec": 0 00:27:53.567 }, 00:27:53.567 "claimed": false, 00:27:53.567 "zoned": false, 00:27:53.567 "supported_io_types": { 00:27:53.567 "read": true, 00:27:53.567 "write": true, 00:27:53.567 "unmap": false, 00:27:53.567 "flush": false, 00:27:53.567 "reset": true, 00:27:53.567 "nvme_admin": false, 00:27:53.567 "nvme_io": false, 00:27:53.567 "nvme_io_md": false, 00:27:53.567 "write_zeroes": true, 00:27:53.567 "zcopy": false, 00:27:53.567 "get_zone_info": false, 00:27:53.567 "zone_management": false, 00:27:53.567 "zone_append": false, 
00:27:53.567 "compare": false, 00:27:53.567 "compare_and_write": false, 00:27:53.567 "abort": false, 00:27:53.567 "seek_hole": false, 00:27:53.567 "seek_data": false, 00:27:53.567 "copy": false, 00:27:53.567 "nvme_iov_md": false 00:27:53.567 }, 00:27:53.568 "memory_domains": [ 00:27:53.568 { 00:27:53.568 "dma_device_id": "system", 00:27:53.568 "dma_device_type": 1 00:27:53.568 }, 00:27:53.568 { 00:27:53.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:53.568 "dma_device_type": 2 00:27:53.568 }, 00:27:53.568 { 00:27:53.568 "dma_device_id": "system", 00:27:53.568 "dma_device_type": 1 00:27:53.568 }, 00:27:53.568 { 00:27:53.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:53.568 "dma_device_type": 2 00:27:53.568 } 00:27:53.568 ], 00:27:53.568 "driver_specific": { 00:27:53.568 "raid": { 00:27:53.568 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:27:53.568 "strip_size_kb": 0, 00:27:53.568 "state": "online", 00:27:53.568 "raid_level": "raid1", 00:27:53.568 "superblock": true, 00:27:53.568 "num_base_bdevs": 2, 00:27:53.568 "num_base_bdevs_discovered": 2, 00:27:53.568 "num_base_bdevs_operational": 2, 00:27:53.568 "base_bdevs_list": [ 00:27:53.568 { 00:27:53.568 "name": "pt1", 00:27:53.568 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:53.568 "is_configured": true, 00:27:53.568 "data_offset": 256, 00:27:53.568 "data_size": 7936 00:27:53.568 }, 00:27:53.568 { 00:27:53.568 "name": "pt2", 00:27:53.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:53.568 "is_configured": true, 00:27:53.568 "data_offset": 256, 00:27:53.568 "data_size": 7936 00:27:53.568 } 00:27:53.568 ] 00:27:53.568 } 00:27:53.568 } 00:27:53.568 }' 00:27:53.568 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:53.568 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:53.568 pt2' 00:27:53.568 20:03:45 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:53.568 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:53.568 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:53.827 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:53.827 "name": "pt1", 00:27:53.827 "aliases": [ 00:27:53.827 "00000000-0000-0000-0000-000000000001" 00:27:53.827 ], 00:27:53.827 "product_name": "passthru", 00:27:53.827 "block_size": 4096, 00:27:53.827 "num_blocks": 8192, 00:27:53.827 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:53.827 "assigned_rate_limits": { 00:27:53.827 "rw_ios_per_sec": 0, 00:27:53.827 "rw_mbytes_per_sec": 0, 00:27:53.827 "r_mbytes_per_sec": 0, 00:27:53.827 "w_mbytes_per_sec": 0 00:27:53.827 }, 00:27:53.827 "claimed": true, 00:27:53.827 "claim_type": "exclusive_write", 00:27:53.827 "zoned": false, 00:27:53.827 "supported_io_types": { 00:27:53.827 "read": true, 00:27:53.827 "write": true, 00:27:53.827 "unmap": true, 00:27:53.827 "flush": true, 00:27:53.827 "reset": true, 00:27:53.827 "nvme_admin": false, 00:27:53.827 "nvme_io": false, 00:27:53.827 "nvme_io_md": false, 00:27:53.827 "write_zeroes": true, 00:27:53.827 "zcopy": true, 00:27:53.827 "get_zone_info": false, 00:27:53.827 "zone_management": false, 00:27:53.827 "zone_append": false, 00:27:53.827 "compare": false, 00:27:53.827 "compare_and_write": false, 00:27:53.827 "abort": true, 00:27:53.827 "seek_hole": false, 00:27:53.827 "seek_data": false, 00:27:53.827 "copy": true, 00:27:53.827 "nvme_iov_md": false 00:27:53.827 }, 00:27:53.827 "memory_domains": [ 00:27:53.827 { 00:27:53.827 "dma_device_id": "system", 00:27:53.827 "dma_device_type": 1 00:27:53.827 }, 00:27:53.827 { 00:27:53.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:53.827 
"dma_device_type": 2 00:27:53.827 } 00:27:53.827 ], 00:27:53.827 "driver_specific": { 00:27:53.827 "passthru": { 00:27:53.827 "name": "pt1", 00:27:53.827 "base_bdev_name": "malloc1" 00:27:53.827 } 00:27:53.828 } 00:27:53.828 }' 00:27:53.828 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:53.828 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:54.087 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:54.346 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:54.346 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:54.346 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:54.346 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:54.346 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:54.346 "name": "pt2", 
00:27:54.346 "aliases": [ 00:27:54.346 "00000000-0000-0000-0000-000000000002" 00:27:54.346 ], 00:27:54.346 "product_name": "passthru", 00:27:54.346 "block_size": 4096, 00:27:54.346 "num_blocks": 8192, 00:27:54.346 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:54.346 "assigned_rate_limits": { 00:27:54.346 "rw_ios_per_sec": 0, 00:27:54.346 "rw_mbytes_per_sec": 0, 00:27:54.346 "r_mbytes_per_sec": 0, 00:27:54.346 "w_mbytes_per_sec": 0 00:27:54.346 }, 00:27:54.346 "claimed": true, 00:27:54.346 "claim_type": "exclusive_write", 00:27:54.346 "zoned": false, 00:27:54.346 "supported_io_types": { 00:27:54.346 "read": true, 00:27:54.346 "write": true, 00:27:54.346 "unmap": true, 00:27:54.346 "flush": true, 00:27:54.346 "reset": true, 00:27:54.346 "nvme_admin": false, 00:27:54.346 "nvme_io": false, 00:27:54.346 "nvme_io_md": false, 00:27:54.346 "write_zeroes": true, 00:27:54.346 "zcopy": true, 00:27:54.346 "get_zone_info": false, 00:27:54.346 "zone_management": false, 00:27:54.346 "zone_append": false, 00:27:54.346 "compare": false, 00:27:54.346 "compare_and_write": false, 00:27:54.346 "abort": true, 00:27:54.346 "seek_hole": false, 00:27:54.346 "seek_data": false, 00:27:54.346 "copy": true, 00:27:54.346 "nvme_iov_md": false 00:27:54.346 }, 00:27:54.346 "memory_domains": [ 00:27:54.346 { 00:27:54.346 "dma_device_id": "system", 00:27:54.346 "dma_device_type": 1 00:27:54.346 }, 00:27:54.346 { 00:27:54.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:54.346 "dma_device_type": 2 00:27:54.346 } 00:27:54.346 ], 00:27:54.347 "driver_specific": { 00:27:54.347 "passthru": { 00:27:54.347 "name": "pt2", 00:27:54.347 "base_bdev_name": "malloc2" 00:27:54.347 } 00:27:54.347 } 00:27:54.347 }' 00:27:54.347 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:54.605 20:03:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:54.605 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 
4096 == 4096 ]] 00:27:54.605 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:54.605 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:54.605 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:54.605 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:54.605 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:54.864 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:54.864 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:54.864 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:54.864 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:54.864 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:54.864 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:27:55.123 [2024-07-24 20:03:46.518222] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:55.123 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=b4b81a19-604c-459f-b1c6-3c18751fe8cc 00:27:55.123 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # '[' -z b4b81a19-604c-459f-b1c6-3c18751fe8cc ']' 00:27:55.123 20:03:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:55.692 [2024-07-24 20:03:47.027337] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:55.692 [2024-07-24 20:03:47.027363] 
bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:55.692 [2024-07-24 20:03:47.027426] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:55.692 [2024-07-24 20:03:47.027480] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:55.692 [2024-07-24 20:03:47.027492] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1079980 name raid_bdev1, state offline 00:27:55.692 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.692 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:27:56.260 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:27:56.260 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:27:56.260 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:27:56.260 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:56.260 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:27:56.260 20:03:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:56.828 20:03:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:56.828 20:03:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:57.087 20:03:48 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # local es=0 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:57.088 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:57.347 [2024-07-24 20:03:48.872133] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:57.347 [2024-07-24 20:03:48.873535] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:57.347 [2024-07-24 20:03:48.873591] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:57.347 [2024-07-24 20:03:48.873633] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:57.347 [2024-07-24 20:03:48.873652] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:57.347 [2024-07-24 20:03:48.873662] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1077760 name raid_bdev1, state configuring 00:27:57.347 request: 00:27:57.347 { 00:27:57.347 "name": "raid_bdev1", 00:27:57.347 "raid_level": "raid1", 00:27:57.347 "base_bdevs": [ 00:27:57.347 "malloc1", 00:27:57.347 "malloc2" 00:27:57.347 ], 00:27:57.347 "superblock": false, 00:27:57.347 "method": "bdev_raid_create", 00:27:57.347 "req_id": 1 00:27:57.347 } 00:27:57.347 Got JSON-RPC error response 00:27:57.347 response: 00:27:57.347 { 00:27:57.347 "code": -17, 00:27:57.347 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:57.347 } 00:27:57.347 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # es=1 00:27:57.347 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:57.347 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:57.347 20:03:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:57.347 20:03:48 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.347 20:03:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:27:57.915 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:27:57.915 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:27:57.915 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:58.174 [2024-07-24 20:03:49.686192] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:58.174 [2024-07-24 20:03:49.686235] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:58.174 [2024-07-24 20:03:49.686253] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1078460 00:27:58.174 [2024-07-24 20:03:49.686266] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:58.174 [2024-07-24 20:03:49.687839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:58.174 [2024-07-24 20:03:49.687868] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:58.174 [2024-07-24 20:03:49.687940] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:58.174 [2024-07-24 20:03:49.687964] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:58.174 pt1 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.174 20:03:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.743 20:03:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:58.743 "name": "raid_bdev1", 00:27:58.743 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:27:58.743 "strip_size_kb": 0, 00:27:58.743 "state": "configuring", 00:27:58.743 "raid_level": "raid1", 00:27:58.743 "superblock": true, 00:27:58.743 "num_base_bdevs": 2, 00:27:58.743 "num_base_bdevs_discovered": 1, 00:27:58.743 "num_base_bdevs_operational": 2, 00:27:58.743 "base_bdevs_list": [ 00:27:58.743 { 00:27:58.743 "name": "pt1", 00:27:58.743 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:58.743 "is_configured": true, 00:27:58.743 "data_offset": 256, 00:27:58.743 "data_size": 7936 00:27:58.743 }, 00:27:58.743 { 00:27:58.743 "name": null, 00:27:58.743 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:58.743 
"is_configured": false, 00:27:58.743 "data_offset": 256, 00:27:58.743 "data_size": 7936 00:27:58.743 } 00:27:58.743 ] 00:27:58.743 }' 00:27:58.743 20:03:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:58.743 20:03:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:59.309 20:03:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:27:59.309 20:03:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:27:59.309 20:03:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:27:59.309 20:03:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:59.568 [2024-07-24 20:03:51.049817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:59.568 [2024-07-24 20:03:51.049869] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:59.568 [2024-07-24 20:03:51.049890] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1077230 00:27:59.568 [2024-07-24 20:03:51.049902] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:59.568 [2024-07-24 20:03:51.050249] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:59.568 [2024-07-24 20:03:51.050268] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:59.568 [2024-07-24 20:03:51.050332] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:59.568 [2024-07-24 20:03:51.050351] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:59.568 [2024-07-24 20:03:51.050460] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xed17e0 00:27:59.568 
[2024-07-24 20:03:51.050471] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:59.568 [2024-07-24 20:03:51.050637] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x107d4c0 00:27:59.568 [2024-07-24 20:03:51.050768] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xed17e0 00:27:59.568 [2024-07-24 20:03:51.050778] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xed17e0 00:27:59.568 [2024-07-24 20:03:51.050877] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:59.568 pt2 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:59.568 20:03:51 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.568 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.827 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:59.827 "name": "raid_bdev1", 00:27:59.827 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:27:59.827 "strip_size_kb": 0, 00:27:59.827 "state": "online", 00:27:59.827 "raid_level": "raid1", 00:27:59.827 "superblock": true, 00:27:59.827 "num_base_bdevs": 2, 00:27:59.827 "num_base_bdevs_discovered": 2, 00:27:59.827 "num_base_bdevs_operational": 2, 00:27:59.827 "base_bdevs_list": [ 00:27:59.827 { 00:27:59.827 "name": "pt1", 00:27:59.827 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:59.827 "is_configured": true, 00:27:59.827 "data_offset": 256, 00:27:59.827 "data_size": 7936 00:27:59.827 }, 00:27:59.827 { 00:27:59.827 "name": "pt2", 00:27:59.827 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:59.827 "is_configured": true, 00:27:59.827 "data_offset": 256, 00:27:59.827 "data_size": 7936 00:27:59.827 } 00:27:59.827 ] 00:27:59.827 }' 00:27:59.827 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:59.827 20:03:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:00.393 20:03:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:00.651 [2024-07-24 20:03:52.144979] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:00.651 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:00.651 "name": "raid_bdev1", 00:28:00.651 "aliases": [ 00:28:00.651 "b4b81a19-604c-459f-b1c6-3c18751fe8cc" 00:28:00.651 ], 00:28:00.651 "product_name": "Raid Volume", 00:28:00.651 "block_size": 4096, 00:28:00.651 "num_blocks": 7936, 00:28:00.651 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:28:00.651 "assigned_rate_limits": { 00:28:00.651 "rw_ios_per_sec": 0, 00:28:00.651 "rw_mbytes_per_sec": 0, 00:28:00.651 "r_mbytes_per_sec": 0, 00:28:00.651 "w_mbytes_per_sec": 0 00:28:00.651 }, 00:28:00.651 "claimed": false, 00:28:00.651 "zoned": false, 00:28:00.651 "supported_io_types": { 00:28:00.651 "read": true, 00:28:00.651 "write": true, 00:28:00.651 "unmap": false, 00:28:00.651 "flush": false, 00:28:00.651 "reset": true, 00:28:00.651 "nvme_admin": false, 00:28:00.651 "nvme_io": false, 00:28:00.651 "nvme_io_md": false, 00:28:00.651 "write_zeroes": true, 00:28:00.651 "zcopy": false, 00:28:00.651 "get_zone_info": false, 00:28:00.651 "zone_management": false, 00:28:00.651 "zone_append": false, 00:28:00.651 "compare": false, 00:28:00.651 "compare_and_write": false, 00:28:00.651 "abort": false, 00:28:00.651 "seek_hole": false, 00:28:00.652 "seek_data": false, 00:28:00.652 "copy": false, 00:28:00.652 "nvme_iov_md": false 00:28:00.652 }, 00:28:00.652 "memory_domains": [ 00:28:00.652 { 00:28:00.652 
"dma_device_id": "system", 00:28:00.652 "dma_device_type": 1 00:28:00.652 }, 00:28:00.652 { 00:28:00.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:00.652 "dma_device_type": 2 00:28:00.652 }, 00:28:00.652 { 00:28:00.652 "dma_device_id": "system", 00:28:00.652 "dma_device_type": 1 00:28:00.652 }, 00:28:00.652 { 00:28:00.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:00.652 "dma_device_type": 2 00:28:00.652 } 00:28:00.652 ], 00:28:00.652 "driver_specific": { 00:28:00.652 "raid": { 00:28:00.652 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:28:00.652 "strip_size_kb": 0, 00:28:00.652 "state": "online", 00:28:00.652 "raid_level": "raid1", 00:28:00.652 "superblock": true, 00:28:00.652 "num_base_bdevs": 2, 00:28:00.652 "num_base_bdevs_discovered": 2, 00:28:00.652 "num_base_bdevs_operational": 2, 00:28:00.652 "base_bdevs_list": [ 00:28:00.652 { 00:28:00.652 "name": "pt1", 00:28:00.652 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:00.652 "is_configured": true, 00:28:00.652 "data_offset": 256, 00:28:00.652 "data_size": 7936 00:28:00.652 }, 00:28:00.652 { 00:28:00.652 "name": "pt2", 00:28:00.652 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:00.652 "is_configured": true, 00:28:00.652 "data_offset": 256, 00:28:00.652 "data_size": 7936 00:28:00.652 } 00:28:00.652 ] 00:28:00.652 } 00:28:00.652 } 00:28:00.652 }' 00:28:00.652 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:00.652 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:00.652 pt2' 00:28:00.652 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:00.652 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:00.652 20:03:52 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:00.910 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:00.910 "name": "pt1", 00:28:00.910 "aliases": [ 00:28:00.910 "00000000-0000-0000-0000-000000000001" 00:28:00.910 ], 00:28:00.910 "product_name": "passthru", 00:28:00.910 "block_size": 4096, 00:28:00.910 "num_blocks": 8192, 00:28:00.910 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:00.910 "assigned_rate_limits": { 00:28:00.910 "rw_ios_per_sec": 0, 00:28:00.910 "rw_mbytes_per_sec": 0, 00:28:00.910 "r_mbytes_per_sec": 0, 00:28:00.910 "w_mbytes_per_sec": 0 00:28:00.910 }, 00:28:00.910 "claimed": true, 00:28:00.910 "claim_type": "exclusive_write", 00:28:00.910 "zoned": false, 00:28:00.910 "supported_io_types": { 00:28:00.910 "read": true, 00:28:00.910 "write": true, 00:28:00.910 "unmap": true, 00:28:00.910 "flush": true, 00:28:00.910 "reset": true, 00:28:00.910 "nvme_admin": false, 00:28:00.910 "nvme_io": false, 00:28:00.910 "nvme_io_md": false, 00:28:00.910 "write_zeroes": true, 00:28:00.910 "zcopy": true, 00:28:00.910 "get_zone_info": false, 00:28:00.910 "zone_management": false, 00:28:00.910 "zone_append": false, 00:28:00.910 "compare": false, 00:28:00.910 "compare_and_write": false, 00:28:00.910 "abort": true, 00:28:00.910 "seek_hole": false, 00:28:00.910 "seek_data": false, 00:28:00.910 "copy": true, 00:28:00.910 "nvme_iov_md": false 00:28:00.910 }, 00:28:00.910 "memory_domains": [ 00:28:00.910 { 00:28:00.910 "dma_device_id": "system", 00:28:00.910 "dma_device_type": 1 00:28:00.910 }, 00:28:00.910 { 00:28:00.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:00.910 "dma_device_type": 2 00:28:00.910 } 00:28:00.910 ], 00:28:00.910 "driver_specific": { 00:28:00.910 "passthru": { 00:28:00.910 "name": "pt1", 00:28:00.910 "base_bdev_name": "malloc1" 00:28:00.910 } 00:28:00.910 } 00:28:00.910 }' 00:28:00.910 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:01.221 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:01.480 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:01.480 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:01.480 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:01.480 20:03:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:01.738 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:01.738 "name": "pt2", 00:28:01.738 "aliases": [ 00:28:01.738 "00000000-0000-0000-0000-000000000002" 00:28:01.738 ], 00:28:01.738 "product_name": "passthru", 00:28:01.738 "block_size": 4096, 00:28:01.738 "num_blocks": 8192, 00:28:01.738 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:01.738 "assigned_rate_limits": { 00:28:01.738 
"rw_ios_per_sec": 0, 00:28:01.738 "rw_mbytes_per_sec": 0, 00:28:01.738 "r_mbytes_per_sec": 0, 00:28:01.738 "w_mbytes_per_sec": 0 00:28:01.738 }, 00:28:01.738 "claimed": true, 00:28:01.738 "claim_type": "exclusive_write", 00:28:01.738 "zoned": false, 00:28:01.738 "supported_io_types": { 00:28:01.738 "read": true, 00:28:01.738 "write": true, 00:28:01.738 "unmap": true, 00:28:01.738 "flush": true, 00:28:01.738 "reset": true, 00:28:01.738 "nvme_admin": false, 00:28:01.738 "nvme_io": false, 00:28:01.738 "nvme_io_md": false, 00:28:01.738 "write_zeroes": true, 00:28:01.738 "zcopy": true, 00:28:01.738 "get_zone_info": false, 00:28:01.738 "zone_management": false, 00:28:01.738 "zone_append": false, 00:28:01.739 "compare": false, 00:28:01.739 "compare_and_write": false, 00:28:01.739 "abort": true, 00:28:01.739 "seek_hole": false, 00:28:01.739 "seek_data": false, 00:28:01.739 "copy": true, 00:28:01.739 "nvme_iov_md": false 00:28:01.739 }, 00:28:01.739 "memory_domains": [ 00:28:01.739 { 00:28:01.739 "dma_device_id": "system", 00:28:01.739 "dma_device_type": 1 00:28:01.739 }, 00:28:01.739 { 00:28:01.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:01.739 "dma_device_type": 2 00:28:01.739 } 00:28:01.739 ], 00:28:01.739 "driver_specific": { 00:28:01.739 "passthru": { 00:28:01.739 "name": "pt2", 00:28:01.739 "base_bdev_name": "malloc2" 00:28:01.739 } 00:28:01.739 } 00:28:01.739 }' 00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:01.739 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:02.002 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:02.002 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:02.002 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:02.002 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:02.002 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:02.002 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:28:02.262 [2024-07-24 20:03:53.673032] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:02.262 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # '[' b4b81a19-604c-459f-b1c6-3c18751fe8cc '!=' b4b81a19-604c-459f-b1c6-3c18751fe8cc ']' 00:28:02.262 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:28:02.262 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:02.262 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:28:02.262 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:02.521 [2024-07-24 20:03:53.921482] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:02.521 20:03:53 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.521 20:03:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.779 20:03:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:02.779 "name": "raid_bdev1", 00:28:02.779 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:28:02.779 "strip_size_kb": 0, 00:28:02.779 "state": "online", 00:28:02.779 "raid_level": "raid1", 00:28:02.779 "superblock": true, 00:28:02.779 "num_base_bdevs": 2, 00:28:02.779 "num_base_bdevs_discovered": 1, 00:28:02.779 "num_base_bdevs_operational": 1, 00:28:02.779 "base_bdevs_list": [ 00:28:02.779 { 00:28:02.779 "name": null, 00:28:02.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:02.779 "is_configured": false, 00:28:02.779 "data_offset": 256, 00:28:02.779 "data_size": 7936 
00:28:02.779 }, 00:28:02.779 { 00:28:02.779 "name": "pt2", 00:28:02.779 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:02.779 "is_configured": true, 00:28:02.779 "data_offset": 256, 00:28:02.779 "data_size": 7936 00:28:02.779 } 00:28:02.779 ] 00:28:02.779 }' 00:28:02.779 20:03:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:02.779 20:03:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:03.346 20:03:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:03.604 [2024-07-24 20:03:55.024358] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:03.604 [2024-07-24 20:03:55.024387] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:03.604 [2024-07-24 20:03:55.024442] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:03.604 [2024-07-24 20:03:55.024483] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:03.604 [2024-07-24 20:03:55.024494] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed17e0 name raid_bdev1, state offline 00:28:03.604 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:28:03.604 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.862 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:28:03.862 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:28:03.862 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:28:03.862 20:03:55 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:28:03.862 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:04.120 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:28:04.120 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:28:04.120 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:28:04.120 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:28:04.120 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@534 -- # i=1 00:28:04.120 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:04.379 [2024-07-24 20:03:55.770304] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:04.379 [2024-07-24 20:03:55.770346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:04.379 [2024-07-24 20:03:55.770364] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1076e50 00:28:04.379 [2024-07-24 20:03:55.770376] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:04.379 [2024-07-24 20:03:55.771963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:04.379 [2024-07-24 20:03:55.771993] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:04.379 [2024-07-24 20:03:55.772055] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:04.379 [2024-07-24 20:03:55.772080] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:04.379 
[2024-07-24 20:03:55.772163] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x107ce60 00:28:04.379 [2024-07-24 20:03:55.772174] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:04.379 [2024-07-24 20:03:55.772340] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1078ed0 00:28:04.379 [2024-07-24 20:03:55.772468] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x107ce60 00:28:04.379 [2024-07-24 20:03:55.772478] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x107ce60 00:28:04.379 [2024-07-24 20:03:55.772571] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:04.379 pt2 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.379 20:03:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.637 20:03:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:04.637 "name": "raid_bdev1", 00:28:04.637 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:28:04.637 "strip_size_kb": 0, 00:28:04.637 "state": "online", 00:28:04.637 "raid_level": "raid1", 00:28:04.637 "superblock": true, 00:28:04.637 "num_base_bdevs": 2, 00:28:04.637 "num_base_bdevs_discovered": 1, 00:28:04.637 "num_base_bdevs_operational": 1, 00:28:04.637 "base_bdevs_list": [ 00:28:04.637 { 00:28:04.637 "name": null, 00:28:04.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:04.637 "is_configured": false, 00:28:04.637 "data_offset": 256, 00:28:04.637 "data_size": 7936 00:28:04.637 }, 00:28:04.637 { 00:28:04.637 "name": "pt2", 00:28:04.637 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:04.637 "is_configured": true, 00:28:04.637 "data_offset": 256, 00:28:04.637 "data_size": 7936 00:28:04.637 } 00:28:04.637 ] 00:28:04.637 }' 00:28:04.637 20:03:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:04.637 20:03:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:05.204 20:03:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:05.462 [2024-07-24 20:03:56.877252] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:05.462 [2024-07-24 20:03:56.877280] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:05.462 [2024-07-24 20:03:56.877332] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:05.462 [2024-07-24 
20:03:56.877378] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:05.462 [2024-07-24 20:03:56.877397] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x107ce60 name raid_bdev1, state offline 00:28:05.462 20:03:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:28:05.462 20:03:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.721 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:28:05.721 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:28:05.721 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:28:05.721 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:05.979 [2024-07-24 20:03:57.378552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:05.979 [2024-07-24 20:03:57.378593] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:05.979 [2024-07-24 20:03:57.378611] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1079d20 00:28:05.979 [2024-07-24 20:03:57.378623] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:05.979 [2024-07-24 20:03:57.380205] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:05.979 [2024-07-24 20:03:57.380234] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:05.979 [2024-07-24 20:03:57.380296] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:05.979 
[2024-07-24 20:03:57.380320] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:05.979 [2024-07-24 20:03:57.380432] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:05.979 [2024-07-24 20:03:57.380446] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:05.979 [2024-07-24 20:03:57.380459] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x107af40 name raid_bdev1, state configuring 00:28:05.979 [2024-07-24 20:03:57.380482] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:05.979 [2024-07-24 20:03:57.380538] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x10788c0 00:28:05.979 [2024-07-24 20:03:57.380548] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:05.979 [2024-07-24 20:03:57.380710] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1076a60 00:28:05.979 [2024-07-24 20:03:57.380830] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10788c0 00:28:05.979 [2024-07-24 20:03:57.380839] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10788c0 00:28:05.979 [2024-07-24 20:03:57.380937] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:05.979 pt1 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.979 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.238 20:03:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:06.238 "name": "raid_bdev1", 00:28:06.238 "uuid": "b4b81a19-604c-459f-b1c6-3c18751fe8cc", 00:28:06.238 "strip_size_kb": 0, 00:28:06.238 "state": "online", 00:28:06.238 "raid_level": "raid1", 00:28:06.238 "superblock": true, 00:28:06.238 "num_base_bdevs": 2, 00:28:06.238 "num_base_bdevs_discovered": 1, 00:28:06.238 "num_base_bdevs_operational": 1, 00:28:06.238 "base_bdevs_list": [ 00:28:06.238 { 00:28:06.238 "name": null, 00:28:06.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:06.238 "is_configured": false, 00:28:06.238 "data_offset": 256, 00:28:06.238 "data_size": 7936 00:28:06.238 }, 00:28:06.238 { 00:28:06.238 "name": "pt2", 00:28:06.238 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:06.238 "is_configured": true, 00:28:06.238 "data_offset": 256, 00:28:06.238 "data_size": 7936 00:28:06.238 } 00:28:06.238 ] 00:28:06.238 }' 00:28:06.238 20:03:57 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:06.238 20:03:57 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:06.804 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:06.804 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:07.062 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:28:07.062 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:07.062 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:28:07.321 [2024-07-24 20:03:58.674222] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # '[' b4b81a19-604c-459f-b1c6-3c18751fe8cc '!=' b4b81a19-604c-459f-b1c6-3c18751fe8cc ']' 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@578 -- # killprocess 1519273 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # '[' -z 1519273 ']' 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # kill -0 1519273 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # uname 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1519273 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1519273' 00:28:07.321 killing process with pid 1519273 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@969 -- # kill 1519273 00:28:07.321 [2024-07-24 20:03:58.741517] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:07.321 [2024-07-24 20:03:58.741572] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:07.321 [2024-07-24 20:03:58.741614] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:07.321 [2024-07-24 20:03:58.741626] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10788c0 name raid_bdev1, state offline 00:28:07.321 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@974 -- # wait 1519273 00:28:07.321 [2024-07-24 20:03:58.757826] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:07.579 20:03:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@580 -- # return 0 00:28:07.579 00:28:07.579 real 0m17.283s 00:28:07.579 user 0m31.479s 00:28:07.580 sys 0m3.154s 00:28:07.580 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:07.580 20:03:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:07.580 ************************************ 00:28:07.580 END TEST raid_superblock_test_4k 00:28:07.580 ************************************ 00:28:07.580 20:03:59 bdev_raid -- bdev/bdev_raid.sh@980 -- # '[' true = true ']' 00:28:07.580 20:03:59 bdev_raid -- bdev/bdev_raid.sh@981 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:28:07.580 20:03:59 bdev_raid -- common/autotest_common.sh@1101 -- # 
'[' 7 -le 1 ']' 00:28:07.580 20:03:59 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:07.580 20:03:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:07.580 ************************************ 00:28:07.580 START TEST raid_rebuild_test_sb_4k 00:28:07.580 ************************************ 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # local verify=true 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:07.580 20:03:59 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # local strip_size 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # local create_arg 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@594 -- # local data_offset 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # raid_pid=1521860 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@613 -- # waitforlisten 1521860 /var/tmp/spdk-raid.sock 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1521860 ']' 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for 
process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:07.580 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:07.580 20:03:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:07.580 [2024-07-24 20:03:59.125696] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:28:07.580 [2024-07-24 20:03:59.125770] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1521860 ] 00:28:07.580 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:07.580 Zero copy mechanism will not be used. 00:28:07.839 [2024-07-24 20:03:59.255459] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:07.839 [2024-07-24 20:03:59.357257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:07.839 [2024-07-24 20:03:59.420977] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:07.839 [2024-07-24 20:03:59.421022] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:08.773 20:04:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:08.773 20:04:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:28:08.773 20:04:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:28:08.773 20:04:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:28:08.773 BaseBdev1_malloc 00:28:08.773 20:04:00 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:09.031 [2024-07-24 20:04:00.491523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:09.031 [2024-07-24 20:04:00.491571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:09.031 [2024-07-24 20:04:00.491594] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdd7cd0 00:28:09.031 [2024-07-24 20:04:00.491607] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:09.031 [2024-07-24 20:04:00.493237] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:09.031 [2024-07-24 20:04:00.493267] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:09.031 BaseBdev1 00:28:09.031 20:04:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:28:09.031 20:04:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:28:09.289 BaseBdev2_malloc 00:28:09.289 20:04:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:09.546 [2024-07-24 20:04:00.985559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:09.547 [2024-07-24 20:04:00.985605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:09.547 [2024-07-24 20:04:00.985623] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xddb460 00:28:09.547 [2024-07-24 20:04:00.985635] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:09.547 [2024-07-24 20:04:00.987170] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:09.547 [2024-07-24 20:04:00.987199] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:09.547 BaseBdev2 00:28:09.547 20:04:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:28:09.805 spare_malloc 00:28:09.805 20:04:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:10.370 spare_delay 00:28:10.370 20:04:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:10.628 [2024-07-24 20:04:01.992807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:10.628 [2024-07-24 20:04:01.992850] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:10.628 [2024-07-24 20:04:01.992870] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdcfc70 00:28:10.628 [2024-07-24 20:04:01.992883] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:10.628 [2024-07-24 20:04:01.994312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:10.628 [2024-07-24 20:04:01.994341] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:10.628 spare 00:28:10.628 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:10.886 [2024-07-24 20:04:02.253529] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:10.886 [2024-07-24 20:04:02.254885] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:10.886 [2024-07-24 20:04:02.255057] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe9ac90 00:28:10.886 [2024-07-24 20:04:02.255070] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:10.886 [2024-07-24 20:04:02.255269] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdcff00 00:28:10.886 [2024-07-24 20:04:02.255430] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe9ac90 00:28:10.886 [2024-07-24 20:04:02.255441] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe9ac90 00:28:10.886 [2024-07-24 20:04:02.255547] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:10.886 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.887 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.145 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.145 "name": "raid_bdev1", 00:28:11.145 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:11.145 "strip_size_kb": 0, 00:28:11.145 "state": "online", 00:28:11.145 "raid_level": "raid1", 00:28:11.145 "superblock": true, 00:28:11.145 "num_base_bdevs": 2, 00:28:11.145 "num_base_bdevs_discovered": 2, 00:28:11.145 "num_base_bdevs_operational": 2, 00:28:11.145 "base_bdevs_list": [ 00:28:11.145 { 00:28:11.145 "name": "BaseBdev1", 00:28:11.145 "uuid": "b246ca4f-1a65-5ff1-a440-455b13af49b7", 00:28:11.145 "is_configured": true, 00:28:11.145 "data_offset": 256, 00:28:11.145 "data_size": 7936 00:28:11.145 }, 00:28:11.145 { 00:28:11.145 "name": "BaseBdev2", 00:28:11.145 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:11.145 "is_configured": true, 00:28:11.145 "data_offset": 256, 00:28:11.145 "data_size": 7936 00:28:11.145 } 00:28:11.145 ] 00:28:11.145 }' 00:28:11.145 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.145 20:04:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:11.712 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:28:11.712 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:11.712 [2024-07-24 20:04:03.304517] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:11.969 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:28:11.969 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.969 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:12.229 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:12.229 [2024-07-24 20:04:03.805638] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdcff00 00:28:12.229 /dev/nbd0 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:12.488 1+0 records in 00:28:12.488 1+0 records out 00:28:12.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252087 s, 16.2 MB/s 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.488 
20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:28:12.488 20:04:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:13.423 7936+0 records in 00:28:13.423 7936+0 records out 00:28:13.423 32505856 bytes (33 MB, 31 MiB) copied, 0.799683 s, 40.6 MB/s 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:13.423 20:04:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:28:13.423 [2024-07-24 20:04:04.956572] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:13.715 [2024-07-24 20:04:05.280767] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.715 20:04:05 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.715 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.974 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.974 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.974 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.974 "name": "raid_bdev1", 00:28:13.974 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:13.974 "strip_size_kb": 0, 00:28:13.974 "state": "online", 00:28:13.974 "raid_level": "raid1", 00:28:13.974 "superblock": true, 00:28:13.974 "num_base_bdevs": 2, 00:28:13.974 "num_base_bdevs_discovered": 1, 00:28:13.974 "num_base_bdevs_operational": 1, 00:28:13.974 "base_bdevs_list": [ 00:28:13.974 { 00:28:13.974 "name": null, 00:28:13.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.974 "is_configured": false, 00:28:13.974 "data_offset": 256, 00:28:13.974 "data_size": 7936 00:28:13.974 }, 00:28:13.974 { 00:28:13.974 "name": "BaseBdev2", 00:28:13.974 "uuid": 
"4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:13.974 "is_configured": true, 00:28:13.974 "data_offset": 256, 00:28:13.974 "data_size": 7936 00:28:13.974 } 00:28:13.974 ] 00:28:13.974 }' 00:28:13.974 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:13.974 20:04:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:14.541 20:04:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:14.800 [2024-07-24 20:04:06.355621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:14.800 [2024-07-24 20:04:06.360599] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdcf4e0 00:28:14.800 [2024-07-24 20:04:06.362936] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:14.800 20:04:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.178 
20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:16.178 "name": "raid_bdev1", 00:28:16.178 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:16.178 "strip_size_kb": 0, 00:28:16.178 "state": "online", 00:28:16.178 "raid_level": "raid1", 00:28:16.178 "superblock": true, 00:28:16.178 "num_base_bdevs": 2, 00:28:16.178 "num_base_bdevs_discovered": 2, 00:28:16.178 "num_base_bdevs_operational": 2, 00:28:16.178 "process": { 00:28:16.178 "type": "rebuild", 00:28:16.178 "target": "spare", 00:28:16.178 "progress": { 00:28:16.178 "blocks": 2816, 00:28:16.178 "percent": 35 00:28:16.178 } 00:28:16.178 }, 00:28:16.178 "base_bdevs_list": [ 00:28:16.178 { 00:28:16.178 "name": "spare", 00:28:16.178 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:16.178 "is_configured": true, 00:28:16.178 "data_offset": 256, 00:28:16.178 "data_size": 7936 00:28:16.178 }, 00:28:16.178 { 00:28:16.178 "name": "BaseBdev2", 00:28:16.178 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:16.178 "is_configured": true, 00:28:16.178 "data_offset": 256, 00:28:16.178 "data_size": 7936 00:28:16.178 } 00:28:16.178 ] 00:28:16.178 }' 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:16.178 20:04:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:16.438 [2024-07-24 20:04:07.889469] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:16.438 [2024-07-24 20:04:07.975474] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:16.438 [2024-07-24 20:04:07.975521] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:16.438 [2024-07-24 20:04:07.975537] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:16.438 [2024-07-24 20:04:07.975545] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.438 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.697 20:04:08 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:16.697 "name": "raid_bdev1", 00:28:16.697 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:16.697 "strip_size_kb": 0, 00:28:16.697 "state": "online", 00:28:16.697 "raid_level": "raid1", 00:28:16.697 "superblock": true, 00:28:16.697 "num_base_bdevs": 2, 00:28:16.697 "num_base_bdevs_discovered": 1, 00:28:16.697 "num_base_bdevs_operational": 1, 00:28:16.697 "base_bdevs_list": [ 00:28:16.697 { 00:28:16.697 "name": null, 00:28:16.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:16.697 "is_configured": false, 00:28:16.697 "data_offset": 256, 00:28:16.697 "data_size": 7936 00:28:16.697 }, 00:28:16.697 { 00:28:16.697 "name": "BaseBdev2", 00:28:16.697 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:16.697 "is_configured": true, 00:28:16.697 "data_offset": 256, 00:28:16.697 "data_size": 7936 00:28:16.697 } 00:28:16.697 ] 00:28:16.697 }' 00:28:16.697 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:16.697 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:17.636 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:17.636 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:17.636 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:17.636 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:17.636 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:17.636 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.636 20:04:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.636 20:04:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:17.636 "name": "raid_bdev1", 00:28:17.636 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:17.636 "strip_size_kb": 0, 00:28:17.636 "state": "online", 00:28:17.636 "raid_level": "raid1", 00:28:17.636 "superblock": true, 00:28:17.636 "num_base_bdevs": 2, 00:28:17.636 "num_base_bdevs_discovered": 1, 00:28:17.636 "num_base_bdevs_operational": 1, 00:28:17.636 "base_bdevs_list": [ 00:28:17.636 { 00:28:17.636 "name": null, 00:28:17.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.636 "is_configured": false, 00:28:17.636 "data_offset": 256, 00:28:17.636 "data_size": 7936 00:28:17.636 }, 00:28:17.636 { 00:28:17.636 "name": "BaseBdev2", 00:28:17.636 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:17.636 "is_configured": true, 00:28:17.636 "data_offset": 256, 00:28:17.636 "data_size": 7936 00:28:17.636 } 00:28:17.636 ] 00:28:17.636 }' 00:28:17.636 20:04:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:17.636 20:04:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:17.636 20:04:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:17.895 20:04:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:17.895 20:04:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:17.895 [2024-07-24 20:04:09.467598] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:17.895 [2024-07-24 20:04:09.473226] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdcee40 00:28:17.895 [2024-07-24 20:04:09.474751] 
bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:18.155 20:04:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@678 -- # sleep 1 00:28:19.121 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:19.121 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.121 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:19.121 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:19.121 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.121 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.121 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:19.381 "name": "raid_bdev1", 00:28:19.381 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:19.381 "strip_size_kb": 0, 00:28:19.381 "state": "online", 00:28:19.381 "raid_level": "raid1", 00:28:19.381 "superblock": true, 00:28:19.381 "num_base_bdevs": 2, 00:28:19.381 "num_base_bdevs_discovered": 2, 00:28:19.381 "num_base_bdevs_operational": 2, 00:28:19.381 "process": { 00:28:19.381 "type": "rebuild", 00:28:19.381 "target": "spare", 00:28:19.381 "progress": { 00:28:19.381 "blocks": 3072, 00:28:19.381 "percent": 38 00:28:19.381 } 00:28:19.381 }, 00:28:19.381 "base_bdevs_list": [ 00:28:19.381 { 00:28:19.381 "name": "spare", 00:28:19.381 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:19.381 "is_configured": true, 00:28:19.381 "data_offset": 256, 00:28:19.381 "data_size": 
7936 00:28:19.381 }, 00:28:19.381 { 00:28:19.381 "name": "BaseBdev2", 00:28:19.381 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:19.381 "is_configured": true, 00:28:19.381 "data_offset": 256, 00:28:19.381 "data_size": 7936 00:28:19.381 } 00:28:19.381 ] 00:28:19.381 }' 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:28:19.381 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # local timeout=1071 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 
00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.381 20:04:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.640 20:04:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:19.640 "name": "raid_bdev1", 00:28:19.640 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:19.640 "strip_size_kb": 0, 00:28:19.640 "state": "online", 00:28:19.641 "raid_level": "raid1", 00:28:19.641 "superblock": true, 00:28:19.641 "num_base_bdevs": 2, 00:28:19.641 "num_base_bdevs_discovered": 2, 00:28:19.641 "num_base_bdevs_operational": 2, 00:28:19.641 "process": { 00:28:19.641 "type": "rebuild", 00:28:19.641 "target": "spare", 00:28:19.641 "progress": { 00:28:19.641 "blocks": 3840, 00:28:19.641 "percent": 48 00:28:19.641 } 00:28:19.641 }, 00:28:19.641 "base_bdevs_list": [ 00:28:19.641 { 00:28:19.641 "name": "spare", 00:28:19.641 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:19.641 "is_configured": true, 00:28:19.641 "data_offset": 256, 00:28:19.641 "data_size": 7936 00:28:19.641 }, 00:28:19.641 { 00:28:19.641 "name": "BaseBdev2", 00:28:19.641 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:19.641 "is_configured": true, 00:28:19.641 "data_offset": 256, 00:28:19.641 "data_size": 7936 00:28:19.641 } 00:28:19.641 ] 00:28:19.641 }' 00:28:19.641 20:04:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:19.641 20:04:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:19.641 20:04:11 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:19.641 20:04:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:19.641 20:04:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:21.019 "name": "raid_bdev1", 00:28:21.019 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:21.019 "strip_size_kb": 0, 00:28:21.019 "state": "online", 00:28:21.019 "raid_level": "raid1", 00:28:21.019 "superblock": true, 00:28:21.019 "num_base_bdevs": 2, 00:28:21.019 "num_base_bdevs_discovered": 2, 00:28:21.019 "num_base_bdevs_operational": 2, 00:28:21.019 "process": { 00:28:21.019 "type": "rebuild", 00:28:21.019 "target": "spare", 00:28:21.019 "progress": { 00:28:21.019 "blocks": 7424, 00:28:21.019 "percent": 93 00:28:21.019 } 00:28:21.019 }, 00:28:21.019 
"base_bdevs_list": [ 00:28:21.019 { 00:28:21.019 "name": "spare", 00:28:21.019 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:21.019 "is_configured": true, 00:28:21.019 "data_offset": 256, 00:28:21.019 "data_size": 7936 00:28:21.019 }, 00:28:21.019 { 00:28:21.019 "name": "BaseBdev2", 00:28:21.019 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:21.019 "is_configured": true, 00:28:21.019 "data_offset": 256, 00:28:21.019 "data_size": 7936 00:28:21.019 } 00:28:21.019 ] 00:28:21.019 }' 00:28:21.019 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:21.020 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:21.020 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:21.020 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:21.020 20:04:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:28:21.020 [2024-07-24 20:04:12.598792] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:21.020 [2024-07-24 20:04:12.598850] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:21.020 [2024-07-24 20:04:12.598934] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.962 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:22.223 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:22.223 "name": "raid_bdev1", 00:28:22.223 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:22.223 "strip_size_kb": 0, 00:28:22.223 "state": "online", 00:28:22.223 "raid_level": "raid1", 00:28:22.223 "superblock": true, 00:28:22.223 "num_base_bdevs": 2, 00:28:22.223 "num_base_bdevs_discovered": 2, 00:28:22.223 "num_base_bdevs_operational": 2, 00:28:22.223 "base_bdevs_list": [ 00:28:22.223 { 00:28:22.223 "name": "spare", 00:28:22.223 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:22.223 "is_configured": true, 00:28:22.223 "data_offset": 256, 00:28:22.223 "data_size": 7936 00:28:22.223 }, 00:28:22.223 { 00:28:22.223 "name": "BaseBdev2", 00:28:22.223 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:22.223 "is_configured": true, 00:28:22.223 "data_offset": 256, 00:28:22.223 "data_size": 7936 00:28:22.223 } 00:28:22.223 ] 00:28:22.223 }' 00:28:22.223 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:22.223 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@724 
-- # break 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.482 20:04:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:22.741 "name": "raid_bdev1", 00:28:22.741 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:22.741 "strip_size_kb": 0, 00:28:22.741 "state": "online", 00:28:22.741 "raid_level": "raid1", 00:28:22.741 "superblock": true, 00:28:22.741 "num_base_bdevs": 2, 00:28:22.741 "num_base_bdevs_discovered": 2, 00:28:22.741 "num_base_bdevs_operational": 2, 00:28:22.741 "base_bdevs_list": [ 00:28:22.741 { 00:28:22.741 "name": "spare", 00:28:22.741 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:22.741 "is_configured": true, 00:28:22.741 "data_offset": 256, 00:28:22.741 "data_size": 7936 00:28:22.741 }, 00:28:22.741 { 00:28:22.741 "name": "BaseBdev2", 00:28:22.741 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:22.741 "is_configured": true, 00:28:22.741 "data_offset": 256, 00:28:22.741 "data_size": 7936 00:28:22.741 } 00:28:22.741 ] 00:28:22.741 }' 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // 
"none"' 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:22.741 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.742 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.001 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.001 "name": "raid_bdev1", 00:28:23.001 "uuid": 
"653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:23.001 "strip_size_kb": 0, 00:28:23.001 "state": "online", 00:28:23.001 "raid_level": "raid1", 00:28:23.001 "superblock": true, 00:28:23.001 "num_base_bdevs": 2, 00:28:23.001 "num_base_bdevs_discovered": 2, 00:28:23.001 "num_base_bdevs_operational": 2, 00:28:23.001 "base_bdevs_list": [ 00:28:23.001 { 00:28:23.001 "name": "spare", 00:28:23.001 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:23.001 "is_configured": true, 00:28:23.001 "data_offset": 256, 00:28:23.001 "data_size": 7936 00:28:23.001 }, 00:28:23.001 { 00:28:23.001 "name": "BaseBdev2", 00:28:23.001 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:23.001 "is_configured": true, 00:28:23.001 "data_offset": 256, 00:28:23.001 "data_size": 7936 00:28:23.001 } 00:28:23.001 ] 00:28:23.001 }' 00:28:23.001 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.001 20:04:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:23.569 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:23.829 [2024-07-24 20:04:15.254705] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:23.829 [2024-07-24 20:04:15.254734] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:23.829 [2024-07-24 20:04:15.254793] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:23.829 [2024-07-24 20:04:15.254851] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:23.829 [2024-07-24 20:04:15.254863] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe9ac90 name raid_bdev1, state offline 00:28:23.829 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.829 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # jq length 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:24.088 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:24.348 /dev/nbd0 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:24.348 
20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:24.348 1+0 records in 00:28:24.348 1+0 records out 00:28:24.348 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000179196 s, 22.9 MB/s 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:24.348 20:04:15 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:24.348 20:04:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:24.607 /dev/nbd1 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:24.607 1+0 records in 00:28:24.607 1+0 records out 00:28:24.607 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312704 s, 13.1 MB/s 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 
00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:24.607 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:24.608 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 
-- # (( i = 1 )) 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:24.867 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:28:25.126 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:25.694 20:04:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:25.694 [2024-07-24 20:04:17.208326] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:25.694 [2024-07-24 20:04:17.208374] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:25.694 [2024-07-24 20:04:17.208400] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xea3e40 00:28:25.694 [2024-07-24 20:04:17.208414] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:25.694 [2024-07-24 20:04:17.210043] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:25.694 [2024-07-24 20:04:17.210074] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:25.694 [2024-07-24 20:04:17.210155] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:25.694 [2024-07-24 20:04:17.210183] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:25.694 [2024-07-24 20:04:17.210293] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:25.694 spare 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:25.694 20:04:17 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.694 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.954 [2024-07-24 20:04:17.310607] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xdd1070 00:28:25.954 [2024-07-24 20:04:17.310626] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:25.954 [2024-07-24 20:04:17.310817] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd8370 00:28:25.954 [2024-07-24 20:04:17.310976] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdd1070 00:28:25.954 [2024-07-24 20:04:17.310986] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdd1070 00:28:25.954 [2024-07-24 20:04:17.311093] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:25.954 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.954 "name": "raid_bdev1", 00:28:25.954 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:25.954 "strip_size_kb": 0, 00:28:25.954 "state": "online", 00:28:25.954 "raid_level": "raid1", 00:28:25.954 "superblock": true, 00:28:25.954 "num_base_bdevs": 2, 00:28:25.954 "num_base_bdevs_discovered": 2, 00:28:25.954 "num_base_bdevs_operational": 2, 00:28:25.954 "base_bdevs_list": [ 00:28:25.954 { 
00:28:25.954 "name": "spare", 00:28:25.954 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:25.954 "is_configured": true, 00:28:25.954 "data_offset": 256, 00:28:25.954 "data_size": 7936 00:28:25.954 }, 00:28:25.954 { 00:28:25.954 "name": "BaseBdev2", 00:28:25.954 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:25.954 "is_configured": true, 00:28:25.954 "data_offset": 256, 00:28:25.954 "data_size": 7936 00:28:25.954 } 00:28:25.954 ] 00:28:25.954 }' 00:28:25.954 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.954 20:04:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:26.522 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:26.522 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:26.522 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:26.522 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:26.522 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:26.522 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.522 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.782 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:26.782 "name": "raid_bdev1", 00:28:26.782 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:26.782 "strip_size_kb": 0, 00:28:26.782 "state": "online", 00:28:26.782 "raid_level": "raid1", 00:28:26.782 "superblock": true, 00:28:26.782 "num_base_bdevs": 2, 00:28:26.782 "num_base_bdevs_discovered": 2, 00:28:26.782 
"num_base_bdevs_operational": 2, 00:28:26.782 "base_bdevs_list": [ 00:28:26.782 { 00:28:26.782 "name": "spare", 00:28:26.782 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:26.782 "is_configured": true, 00:28:26.782 "data_offset": 256, 00:28:26.782 "data_size": 7936 00:28:26.782 }, 00:28:26.782 { 00:28:26.782 "name": "BaseBdev2", 00:28:26.782 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:26.782 "is_configured": true, 00:28:26.782 "data_offset": 256, 00:28:26.782 "data_size": 7936 00:28:26.782 } 00:28:26.782 ] 00:28:26.782 }' 00:28:26.782 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:26.782 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:26.782 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:27.042 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:27.042 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.042 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:27.301 [2024-07-24 20:04:18.820721] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.301 20:04:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.560 20:04:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:27.560 "name": "raid_bdev1", 00:28:27.560 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:27.560 "strip_size_kb": 0, 00:28:27.560 "state": "online", 00:28:27.560 "raid_level": "raid1", 00:28:27.560 "superblock": true, 00:28:27.560 "num_base_bdevs": 2, 00:28:27.560 "num_base_bdevs_discovered": 1, 00:28:27.560 "num_base_bdevs_operational": 1, 00:28:27.560 "base_bdevs_list": [ 00:28:27.560 { 00:28:27.560 "name": null, 00:28:27.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:27.560 "is_configured": false, 00:28:27.560 "data_offset": 256, 00:28:27.560 "data_size": 7936 00:28:27.560 }, 00:28:27.560 { 00:28:27.560 "name": "BaseBdev2", 
00:28:27.560 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:27.560 "is_configured": true, 00:28:27.560 "data_offset": 256, 00:28:27.560 "data_size": 7936 00:28:27.560 } 00:28:27.560 ] 00:28:27.560 }' 00:28:27.560 20:04:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:27.560 20:04:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:28.129 20:04:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:28.389 [2024-07-24 20:04:19.851492] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:28.389 [2024-07-24 20:04:19.851644] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:28.389 [2024-07-24 20:04:19.851662] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:28.389 [2024-07-24 20:04:19.851695] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:28.389 [2024-07-24 20:04:19.856552] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd3050 00:28:28.389 [2024-07-24 20:04:19.857886] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:28.389 20:04:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # sleep 1 00:28:29.327 20:04:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:29.327 20:04:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:29.327 20:04:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:29.327 20:04:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:29.327 20:04:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:29.327 20:04:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.327 20:04:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.586 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:29.586 "name": "raid_bdev1", 00:28:29.586 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:29.586 "strip_size_kb": 0, 00:28:29.586 "state": "online", 00:28:29.586 "raid_level": "raid1", 00:28:29.586 "superblock": true, 00:28:29.586 "num_base_bdevs": 2, 00:28:29.586 "num_base_bdevs_discovered": 2, 00:28:29.586 "num_base_bdevs_operational": 2, 00:28:29.586 "process": { 00:28:29.586 "type": "rebuild", 00:28:29.586 "target": "spare", 00:28:29.586 "progress": { 00:28:29.586 "blocks": 3072, 
00:28:29.586 "percent": 38 00:28:29.586 } 00:28:29.586 }, 00:28:29.586 "base_bdevs_list": [ 00:28:29.586 { 00:28:29.586 "name": "spare", 00:28:29.586 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:29.586 "is_configured": true, 00:28:29.586 "data_offset": 256, 00:28:29.586 "data_size": 7936 00:28:29.586 }, 00:28:29.586 { 00:28:29.586 "name": "BaseBdev2", 00:28:29.586 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:29.586 "is_configured": true, 00:28:29.586 "data_offset": 256, 00:28:29.586 "data_size": 7936 00:28:29.586 } 00:28:29.586 ] 00:28:29.586 }' 00:28:29.586 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:29.845 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:29.845 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:29.845 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:29.845 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:30.105 [2024-07-24 20:04:21.445648] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:30.105 [2024-07-24 20:04:21.470665] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:30.105 [2024-07-24 20:04:21.470712] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:30.105 [2024-07-24 20:04:21.470727] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:30.105 [2024-07-24 20:04:21.470735] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.105 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:30.363 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:30.363 "name": "raid_bdev1", 00:28:30.364 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:30.364 "strip_size_kb": 0, 00:28:30.364 "state": "online", 00:28:30.364 "raid_level": "raid1", 00:28:30.364 "superblock": true, 00:28:30.364 "num_base_bdevs": 2, 00:28:30.364 "num_base_bdevs_discovered": 1, 00:28:30.364 "num_base_bdevs_operational": 1, 00:28:30.364 "base_bdevs_list": [ 00:28:30.364 { 00:28:30.364 "name": null, 00:28:30.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:30.364 "is_configured": false, 00:28:30.364 "data_offset": 
256, 00:28:30.364 "data_size": 7936 00:28:30.364 }, 00:28:30.364 { 00:28:30.364 "name": "BaseBdev2", 00:28:30.364 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:30.364 "is_configured": true, 00:28:30.364 "data_offset": 256, 00:28:30.364 "data_size": 7936 00:28:30.364 } 00:28:30.364 ] 00:28:30.364 }' 00:28:30.364 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:30.364 20:04:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:30.932 20:04:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:31.191 [2024-07-24 20:04:22.602222] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:31.191 [2024-07-24 20:04:22.602270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:31.191 [2024-07-24 20:04:22.602293] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdd0d50 00:28:31.191 [2024-07-24 20:04:22.602306] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:31.191 [2024-07-24 20:04:22.602686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:31.191 [2024-07-24 20:04:22.602707] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:31.191 [2024-07-24 20:04:22.602791] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:31.191 [2024-07-24 20:04:22.602803] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:31.191 [2024-07-24 20:04:22.602816] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:31.191 [2024-07-24 20:04:22.602836] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:31.191 [2024-07-24 20:04:22.608028] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd79a0 00:28:31.191 spare 00:28:31.191 [2024-07-24 20:04:22.609380] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:31.191 20:04:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # sleep 1 00:28:32.128 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:32.128 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:32.128 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:32.128 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:32.128 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:32.128 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.128 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:32.387 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:32.387 "name": "raid_bdev1", 00:28:32.387 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:32.387 "strip_size_kb": 0, 00:28:32.387 "state": "online", 00:28:32.387 "raid_level": "raid1", 00:28:32.387 "superblock": true, 00:28:32.387 "num_base_bdevs": 2, 00:28:32.387 "num_base_bdevs_discovered": 2, 00:28:32.387 "num_base_bdevs_operational": 2, 00:28:32.387 "process": { 00:28:32.387 "type": "rebuild", 00:28:32.387 "target": "spare", 00:28:32.387 "progress": { 00:28:32.387 
"blocks": 3072, 00:28:32.387 "percent": 38 00:28:32.387 } 00:28:32.387 }, 00:28:32.387 "base_bdevs_list": [ 00:28:32.387 { 00:28:32.387 "name": "spare", 00:28:32.387 "uuid": "3d4c2dbf-00b1-570a-abfa-b49e76480ac3", 00:28:32.387 "is_configured": true, 00:28:32.387 "data_offset": 256, 00:28:32.387 "data_size": 7936 00:28:32.387 }, 00:28:32.387 { 00:28:32.387 "name": "BaseBdev2", 00:28:32.387 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:32.387 "is_configured": true, 00:28:32.387 "data_offset": 256, 00:28:32.387 "data_size": 7936 00:28:32.387 } 00:28:32.387 ] 00:28:32.387 }' 00:28:32.387 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:32.387 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:32.387 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:32.387 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:32.387 20:04:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:32.645 [2024-07-24 20:04:24.188613] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:32.645 [2024-07-24 20:04:24.221986] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:32.645 [2024-07-24 20:04:24.222033] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:32.645 [2024-07-24 20:04:24.222048] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:32.646 [2024-07-24 20:04:24.222057] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.903 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:33.162 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.162 "name": "raid_bdev1", 00:28:33.162 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:33.162 "strip_size_kb": 0, 00:28:33.162 "state": "online", 00:28:33.162 "raid_level": "raid1", 00:28:33.162 "superblock": true, 00:28:33.162 "num_base_bdevs": 2, 00:28:33.162 "num_base_bdevs_discovered": 1, 00:28:33.162 "num_base_bdevs_operational": 1, 00:28:33.162 "base_bdevs_list": [ 00:28:33.162 { 00:28:33.162 "name": null, 00:28:33.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:33.162 "is_configured": false, 00:28:33.162 
"data_offset": 256, 00:28:33.162 "data_size": 7936 00:28:33.162 }, 00:28:33.162 { 00:28:33.162 "name": "BaseBdev2", 00:28:33.162 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:33.162 "is_configured": true, 00:28:33.162 "data_offset": 256, 00:28:33.162 "data_size": 7936 00:28:33.162 } 00:28:33.162 ] 00:28:33.162 }' 00:28:33.162 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.162 20:04:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:33.731 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:33.731 "name": "raid_bdev1", 00:28:33.731 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:33.731 "strip_size_kb": 0, 00:28:33.731 "state": "online", 00:28:33.731 "raid_level": "raid1", 00:28:33.731 "superblock": true, 00:28:33.731 "num_base_bdevs": 2, 00:28:33.731 "num_base_bdevs_discovered": 1, 00:28:33.731 "num_base_bdevs_operational": 1, 00:28:33.731 "base_bdevs_list": [ 00:28:33.731 { 00:28:33.731 "name": null, 00:28:33.731 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:28:33.731 "is_configured": false, 00:28:33.731 "data_offset": 256, 00:28:33.731 "data_size": 7936 00:28:33.731 }, 00:28:33.731 { 00:28:33.731 "name": "BaseBdev2", 00:28:33.731 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:33.731 "is_configured": true, 00:28:33.731 "data_offset": 256, 00:28:33.731 "data_size": 7936 00:28:33.731 } 00:28:33.731 ] 00:28:33.731 }' 00:28:33.991 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:33.991 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:33.991 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:33.991 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:33.991 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:34.250 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:34.509 [2024-07-24 20:04:25.886838] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:34.509 [2024-07-24 20:04:25.886885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:34.509 [2024-07-24 20:04:25.886905] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdd7f00 00:28:34.509 [2024-07-24 20:04:25.886917] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:34.509 [2024-07-24 20:04:25.887250] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:34.509 [2024-07-24 20:04:25.887269] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:28:34.509 [2024-07-24 20:04:25.887334] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:34.509 [2024-07-24 20:04:25.887348] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:34.509 [2024-07-24 20:04:25.887359] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:34.509 BaseBdev1 00:28:34.509 20:04:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@789 -- # sleep 1 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.447 20:04:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.447 20:04:26 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.706 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.706 "name": "raid_bdev1", 00:28:35.706 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:35.706 "strip_size_kb": 0, 00:28:35.706 "state": "online", 00:28:35.706 "raid_level": "raid1", 00:28:35.706 "superblock": true, 00:28:35.706 "num_base_bdevs": 2, 00:28:35.706 "num_base_bdevs_discovered": 1, 00:28:35.706 "num_base_bdevs_operational": 1, 00:28:35.706 "base_bdevs_list": [ 00:28:35.706 { 00:28:35.706 "name": null, 00:28:35.706 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:35.706 "is_configured": false, 00:28:35.706 "data_offset": 256, 00:28:35.706 "data_size": 7936 00:28:35.706 }, 00:28:35.706 { 00:28:35.706 "name": "BaseBdev2", 00:28:35.706 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:35.706 "is_configured": true, 00:28:35.706 "data_offset": 256, 00:28:35.706 "data_size": 7936 00:28:35.706 } 00:28:35.706 ] 00:28:35.706 }' 00:28:35.706 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.706 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:36.324 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:36.324 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:36.324 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:36.324 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:36.324 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:36.324 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.324 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.582 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:36.582 "name": "raid_bdev1", 00:28:36.582 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:36.582 "strip_size_kb": 0, 00:28:36.582 "state": "online", 00:28:36.582 "raid_level": "raid1", 00:28:36.582 "superblock": true, 00:28:36.582 "num_base_bdevs": 2, 00:28:36.582 "num_base_bdevs_discovered": 1, 00:28:36.582 "num_base_bdevs_operational": 1, 00:28:36.582 "base_bdevs_list": [ 00:28:36.582 { 00:28:36.582 "name": null, 00:28:36.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:36.582 "is_configured": false, 00:28:36.582 "data_offset": 256, 00:28:36.582 "data_size": 7936 00:28:36.582 }, 00:28:36.582 { 00:28:36.582 "name": "BaseBdev2", 00:28:36.582 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:36.582 "is_configured": true, 00:28:36.582 "data_offset": 256, 00:28:36.582 "data_size": 7936 00:28:36.582 } 00:28:36.582 ] 00:28:36.582 }' 00:28:36.582 20:04:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # local es=0 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:36.582 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:36.841 [2024-07-24 20:04:28.293250] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:36.841 [2024-07-24 20:04:28.293373] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:36.841 [2024-07-24 20:04:28.293397] bdev_raid.c:3673:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:28:36.841 request: 00:28:36.841 { 00:28:36.841 "base_bdev": "BaseBdev1", 00:28:36.841 "raid_bdev": "raid_bdev1", 00:28:36.841 "method": "bdev_raid_add_base_bdev", 00:28:36.841 "req_id": 1 00:28:36.841 } 00:28:36.841 Got JSON-RPC error response 00:28:36.841 response: 00:28:36.841 { 00:28:36.841 "code": -22, 00:28:36.841 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:36.841 } 00:28:36.841 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # es=1 00:28:36.841 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:28:36.841 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:28:36.841 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:28:36.841 20:04:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@793 -- # sleep 1 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.778 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.038 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.038 "name": "raid_bdev1", 00:28:38.038 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:38.038 "strip_size_kb": 0, 00:28:38.038 "state": "online", 00:28:38.038 "raid_level": "raid1", 00:28:38.038 "superblock": true, 00:28:38.038 "num_base_bdevs": 2, 00:28:38.038 "num_base_bdevs_discovered": 1, 00:28:38.038 "num_base_bdevs_operational": 1, 00:28:38.038 "base_bdevs_list": [ 00:28:38.038 { 00:28:38.038 "name": null, 00:28:38.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:38.038 "is_configured": false, 00:28:38.038 "data_offset": 256, 00:28:38.038 "data_size": 7936 00:28:38.038 }, 00:28:38.038 { 00:28:38.038 "name": "BaseBdev2", 00:28:38.038 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:38.038 "is_configured": true, 00:28:38.038 "data_offset": 256, 00:28:38.038 "data_size": 7936 00:28:38.038 } 00:28:38.038 ] 00:28:38.038 }' 00:28:38.038 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.038 20:04:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:38.606 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:38.606 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:38.606 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:38.606 
20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:38.606 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:38.606 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.606 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:38.866 "name": "raid_bdev1", 00:28:38.866 "uuid": "653eeeb9-ada4-4b8d-9b24-0f57b09af808", 00:28:38.866 "strip_size_kb": 0, 00:28:38.866 "state": "online", 00:28:38.866 "raid_level": "raid1", 00:28:38.866 "superblock": true, 00:28:38.866 "num_base_bdevs": 2, 00:28:38.866 "num_base_bdevs_discovered": 1, 00:28:38.866 "num_base_bdevs_operational": 1, 00:28:38.866 "base_bdevs_list": [ 00:28:38.866 { 00:28:38.866 "name": null, 00:28:38.866 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:38.866 "is_configured": false, 00:28:38.866 "data_offset": 256, 00:28:38.866 "data_size": 7936 00:28:38.866 }, 00:28:38.866 { 00:28:38.866 "name": "BaseBdev2", 00:28:38.866 "uuid": "4c71b4a4-92c5-541d-8637-2044f7100146", 00:28:38.866 "is_configured": true, 00:28:38.866 "data_offset": 256, 00:28:38.866 "data_size": 7936 00:28:38.866 } 00:28:38.866 ] 00:28:38.866 }' 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:38.866 20:04:30 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@798 -- # killprocess 1521860 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 1521860 ']' 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1521860 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:38.866 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1521860 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1521860' 00:28:39.125 killing process with pid 1521860 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@969 -- # kill 1521860 00:28:39.125 Received shutdown signal, test time was about 60.000000 seconds 00:28:39.125 00:28:39.125 Latency(us) 00:28:39.125 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:39.125 =================================================================================================================== 00:28:39.125 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:39.125 [2024-07-24 20:04:30.478488] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:39.125 [2024-07-24 20:04:30.478575] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:39.125 [2024-07-24 20:04:30.478620] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:39.125 [2024-07-24 20:04:30.478632] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdd1070 name raid_bdev1, state offline 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@974 -- # wait 1521860 00:28:39.125 [2024-07-24 20:04:30.504706] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@800 -- # return 0 00:28:39.125 00:28:39.125 real 0m31.656s 00:28:39.125 user 0m49.172s 00:28:39.125 sys 0m5.255s 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:39.125 20:04:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:39.125 ************************************ 00:28:39.125 END TEST raid_rebuild_test_sb_4k 00:28:39.125 ************************************ 00:28:39.385 20:04:30 bdev_raid -- bdev/bdev_raid.sh@984 -- # base_malloc_params='-m 32' 00:28:39.385 20:04:30 bdev_raid -- bdev/bdev_raid.sh@985 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:28:39.385 20:04:30 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:28:39.385 20:04:30 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:39.385 20:04:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:39.385 ************************************ 00:28:39.385 START TEST raid_state_function_test_sb_md_separate 00:28:39.385 ************************************ 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # 
'[' raid1 '!=' raid1 ']' 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1526351 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1526351' 00:28:39.385 Process raid pid: 1526351 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1526351 /var/tmp/spdk-raid.sock 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1526351 ']' 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:39.385 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:39.385 20:04:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:39.385 [2024-07-24 20:04:30.906374] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:28:39.385 [2024-07-24 20:04:30.906521] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:39.644 [2024-07-24 20:04:31.099973] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:39.644 [2024-07-24 20:04:31.197643] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:39.902 [2024-07-24 20:04:31.270734] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:39.902 [2024-07-24 20:04:31.270795] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:40.161 20:04:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:40.161 20:04:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:28:40.161 20:04:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:40.730 [2024-07-24 20:04:32.219253] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:40.730 [2024-07-24 20:04:32.219293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:40.730 [2024-07-24 20:04:32.219304] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:40.730 
[2024-07-24 20:04:32.219315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:40.730 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.298 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:41.298 "name": "Existed_Raid", 00:28:41.298 "uuid": 
"233c1cf4-ae9b-4378-bf78-6eac587c09db", 00:28:41.298 "strip_size_kb": 0, 00:28:41.298 "state": "configuring", 00:28:41.298 "raid_level": "raid1", 00:28:41.298 "superblock": true, 00:28:41.298 "num_base_bdevs": 2, 00:28:41.298 "num_base_bdevs_discovered": 0, 00:28:41.298 "num_base_bdevs_operational": 2, 00:28:41.298 "base_bdevs_list": [ 00:28:41.298 { 00:28:41.298 "name": "BaseBdev1", 00:28:41.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.298 "is_configured": false, 00:28:41.298 "data_offset": 0, 00:28:41.298 "data_size": 0 00:28:41.298 }, 00:28:41.298 { 00:28:41.298 "name": "BaseBdev2", 00:28:41.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:41.298 "is_configured": false, 00:28:41.298 "data_offset": 0, 00:28:41.298 "data_size": 0 00:28:41.298 } 00:28:41.298 ] 00:28:41.298 }' 00:28:41.298 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:41.298 20:04:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:41.867 20:04:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:42.126 [2024-07-24 20:04:33.654890] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:42.126 [2024-07-24 20:04:33.654919] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ae69f0 name Existed_Raid, state configuring 00:28:42.126 20:04:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:42.384 [2024-07-24 20:04:33.903568] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:42.384 [2024-07-24 20:04:33.903599] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:42.384 [2024-07-24 20:04:33.903608] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:42.385 [2024-07-24 20:04:33.903620] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:42.385 20:04:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:28:42.643 [2024-07-24 20:04:34.162702] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:42.643 BaseBdev1 00:28:42.643 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:42.643 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:28:42.643 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:42.643 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:28:42.643 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:42.643 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:42.643 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:42.902 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:43.162 [ 00:28:43.162 { 00:28:43.162 "name": 
"BaseBdev1", 00:28:43.162 "aliases": [ 00:28:43.162 "45fcff15-d785-4291-8f08-647a512e879f" 00:28:43.162 ], 00:28:43.162 "product_name": "Malloc disk", 00:28:43.162 "block_size": 4096, 00:28:43.162 "num_blocks": 8192, 00:28:43.162 "uuid": "45fcff15-d785-4291-8f08-647a512e879f", 00:28:43.162 "md_size": 32, 00:28:43.162 "md_interleave": false, 00:28:43.162 "dif_type": 0, 00:28:43.162 "assigned_rate_limits": { 00:28:43.162 "rw_ios_per_sec": 0, 00:28:43.162 "rw_mbytes_per_sec": 0, 00:28:43.162 "r_mbytes_per_sec": 0, 00:28:43.162 "w_mbytes_per_sec": 0 00:28:43.162 }, 00:28:43.162 "claimed": true, 00:28:43.162 "claim_type": "exclusive_write", 00:28:43.162 "zoned": false, 00:28:43.162 "supported_io_types": { 00:28:43.162 "read": true, 00:28:43.162 "write": true, 00:28:43.162 "unmap": true, 00:28:43.162 "flush": true, 00:28:43.162 "reset": true, 00:28:43.162 "nvme_admin": false, 00:28:43.162 "nvme_io": false, 00:28:43.162 "nvme_io_md": false, 00:28:43.162 "write_zeroes": true, 00:28:43.162 "zcopy": true, 00:28:43.162 "get_zone_info": false, 00:28:43.162 "zone_management": false, 00:28:43.162 "zone_append": false, 00:28:43.162 "compare": false, 00:28:43.162 "compare_and_write": false, 00:28:43.162 "abort": true, 00:28:43.162 "seek_hole": false, 00:28:43.162 "seek_data": false, 00:28:43.162 "copy": true, 00:28:43.162 "nvme_iov_md": false 00:28:43.162 }, 00:28:43.162 "memory_domains": [ 00:28:43.162 { 00:28:43.162 "dma_device_id": "system", 00:28:43.162 "dma_device_type": 1 00:28:43.162 }, 00:28:43.162 { 00:28:43.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:43.162 "dma_device_type": 2 00:28:43.162 } 00:28:43.162 ], 00:28:43.162 "driver_specific": {} 00:28:43.162 } 00:28:43.162 ] 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:43.162 
20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.162 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:43.421 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.421 "name": "Existed_Raid", 00:28:43.421 "uuid": "2653bddd-900c-435c-aa19-de859d686262", 00:28:43.421 "strip_size_kb": 0, 00:28:43.421 "state": "configuring", 00:28:43.421 "raid_level": "raid1", 00:28:43.421 "superblock": true, 00:28:43.421 "num_base_bdevs": 2, 00:28:43.421 "num_base_bdevs_discovered": 1, 00:28:43.421 "num_base_bdevs_operational": 2, 00:28:43.421 
"base_bdevs_list": [ 00:28:43.421 { 00:28:43.421 "name": "BaseBdev1", 00:28:43.421 "uuid": "45fcff15-d785-4291-8f08-647a512e879f", 00:28:43.421 "is_configured": true, 00:28:43.421 "data_offset": 256, 00:28:43.421 "data_size": 7936 00:28:43.421 }, 00:28:43.421 { 00:28:43.421 "name": "BaseBdev2", 00:28:43.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:43.421 "is_configured": false, 00:28:43.421 "data_offset": 0, 00:28:43.421 "data_size": 0 00:28:43.421 } 00:28:43.421 ] 00:28:43.421 }' 00:28:43.421 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.421 20:04:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:43.989 20:04:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:44.248 [2024-07-24 20:04:35.739008] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:44.249 [2024-07-24 20:04:35.739047] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ae62e0 name Existed_Raid, state configuring 00:28:44.249 20:04:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:44.508 [2024-07-24 20:04:35.987705] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:44.508 [2024-07-24 20:04:35.989123] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:44.508 [2024-07-24 20:04:35.989155] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:44.508 
20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.508 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:44.767 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:44.767 "name": "Existed_Raid", 00:28:44.767 "uuid": 
"a5c18b01-b96a-44cb-ac87-47b24aa370e0", 00:28:44.767 "strip_size_kb": 0, 00:28:44.767 "state": "configuring", 00:28:44.767 "raid_level": "raid1", 00:28:44.767 "superblock": true, 00:28:44.767 "num_base_bdevs": 2, 00:28:44.767 "num_base_bdevs_discovered": 1, 00:28:44.767 "num_base_bdevs_operational": 2, 00:28:44.767 "base_bdevs_list": [ 00:28:44.767 { 00:28:44.767 "name": "BaseBdev1", 00:28:44.767 "uuid": "45fcff15-d785-4291-8f08-647a512e879f", 00:28:44.767 "is_configured": true, 00:28:44.767 "data_offset": 256, 00:28:44.767 "data_size": 7936 00:28:44.767 }, 00:28:44.767 { 00:28:44.767 "name": "BaseBdev2", 00:28:44.767 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:44.767 "is_configured": false, 00:28:44.767 "data_offset": 0, 00:28:44.767 "data_size": 0 00:28:44.767 } 00:28:44.767 ] 00:28:44.767 }' 00:28:44.767 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:44.767 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:45.335 20:04:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:28:45.594 [2024-07-24 20:04:37.018512] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:45.594 [2024-07-24 20:04:37.018655] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ae8250 00:28:45.594 [2024-07-24 20:04:37.018669] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:45.594 [2024-07-24 20:04:37.018728] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae6880 00:28:45.594 [2024-07-24 20:04:37.018831] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ae8250 00:28:45.594 [2024-07-24 20:04:37.018841] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x1ae8250 00:28:45.594 [2024-07-24 20:04:37.018905] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:45.594 BaseBdev2 00:28:45.594 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:45.594 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:28:45.594 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:45.594 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:28:45.594 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:45.594 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:45.594 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:45.853 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:46.112 [ 00:28:46.112 { 00:28:46.112 "name": "BaseBdev2", 00:28:46.112 "aliases": [ 00:28:46.112 "51538553-3c36-42be-95dd-80aa1aff32ed" 00:28:46.112 ], 00:28:46.112 "product_name": "Malloc disk", 00:28:46.112 "block_size": 4096, 00:28:46.112 "num_blocks": 8192, 00:28:46.112 "uuid": "51538553-3c36-42be-95dd-80aa1aff32ed", 00:28:46.112 "md_size": 32, 00:28:46.112 "md_interleave": false, 00:28:46.112 "dif_type": 0, 00:28:46.112 "assigned_rate_limits": { 00:28:46.112 "rw_ios_per_sec": 0, 00:28:46.112 "rw_mbytes_per_sec": 0, 00:28:46.112 "r_mbytes_per_sec": 0, 00:28:46.112 
"w_mbytes_per_sec": 0 00:28:46.112 }, 00:28:46.112 "claimed": true, 00:28:46.112 "claim_type": "exclusive_write", 00:28:46.112 "zoned": false, 00:28:46.112 "supported_io_types": { 00:28:46.112 "read": true, 00:28:46.112 "write": true, 00:28:46.112 "unmap": true, 00:28:46.112 "flush": true, 00:28:46.112 "reset": true, 00:28:46.112 "nvme_admin": false, 00:28:46.112 "nvme_io": false, 00:28:46.112 "nvme_io_md": false, 00:28:46.112 "write_zeroes": true, 00:28:46.112 "zcopy": true, 00:28:46.112 "get_zone_info": false, 00:28:46.112 "zone_management": false, 00:28:46.112 "zone_append": false, 00:28:46.112 "compare": false, 00:28:46.112 "compare_and_write": false, 00:28:46.112 "abort": true, 00:28:46.112 "seek_hole": false, 00:28:46.112 "seek_data": false, 00:28:46.112 "copy": true, 00:28:46.112 "nvme_iov_md": false 00:28:46.112 }, 00:28:46.112 "memory_domains": [ 00:28:46.112 { 00:28:46.112 "dma_device_id": "system", 00:28:46.112 "dma_device_type": 1 00:28:46.112 }, 00:28:46.112 { 00:28:46.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:46.112 "dma_device_type": 2 00:28:46.112 } 00:28:46.112 ], 00:28:46.112 "driver_specific": {} 00:28:46.112 } 00:28:46.112 ] 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:46.112 20:04:37 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.112 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:46.370 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:46.370 "name": "Existed_Raid", 00:28:46.370 "uuid": "a5c18b01-b96a-44cb-ac87-47b24aa370e0", 00:28:46.370 "strip_size_kb": 0, 00:28:46.370 "state": "online", 00:28:46.370 "raid_level": "raid1", 00:28:46.370 "superblock": true, 00:28:46.370 "num_base_bdevs": 2, 00:28:46.370 "num_base_bdevs_discovered": 2, 00:28:46.370 "num_base_bdevs_operational": 2, 00:28:46.370 "base_bdevs_list": [ 00:28:46.370 { 00:28:46.370 "name": "BaseBdev1", 00:28:46.370 "uuid": "45fcff15-d785-4291-8f08-647a512e879f", 00:28:46.370 "is_configured": true, 00:28:46.370 "data_offset": 256, 00:28:46.370 "data_size": 7936 00:28:46.370 }, 00:28:46.370 { 00:28:46.370 "name": 
"BaseBdev2", 00:28:46.371 "uuid": "51538553-3c36-42be-95dd-80aa1aff32ed", 00:28:46.371 "is_configured": true, 00:28:46.371 "data_offset": 256, 00:28:46.371 "data_size": 7936 00:28:46.371 } 00:28:46.371 ] 00:28:46.371 }' 00:28:46.371 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:46.371 20:04:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:46.938 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:47.197 [2024-07-24 20:04:38.711303] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:47.197 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:47.197 "name": "Existed_Raid", 00:28:47.197 "aliases": [ 00:28:47.197 "a5c18b01-b96a-44cb-ac87-47b24aa370e0" 00:28:47.197 ], 00:28:47.197 "product_name": "Raid Volume", 00:28:47.197 "block_size": 4096, 
00:28:47.197 "num_blocks": 7936, 00:28:47.197 "uuid": "a5c18b01-b96a-44cb-ac87-47b24aa370e0", 00:28:47.197 "md_size": 32, 00:28:47.197 "md_interleave": false, 00:28:47.197 "dif_type": 0, 00:28:47.197 "assigned_rate_limits": { 00:28:47.197 "rw_ios_per_sec": 0, 00:28:47.197 "rw_mbytes_per_sec": 0, 00:28:47.197 "r_mbytes_per_sec": 0, 00:28:47.197 "w_mbytes_per_sec": 0 00:28:47.197 }, 00:28:47.197 "claimed": false, 00:28:47.197 "zoned": false, 00:28:47.197 "supported_io_types": { 00:28:47.197 "read": true, 00:28:47.197 "write": true, 00:28:47.197 "unmap": false, 00:28:47.197 "flush": false, 00:28:47.197 "reset": true, 00:28:47.197 "nvme_admin": false, 00:28:47.197 "nvme_io": false, 00:28:47.197 "nvme_io_md": false, 00:28:47.197 "write_zeroes": true, 00:28:47.197 "zcopy": false, 00:28:47.197 "get_zone_info": false, 00:28:47.197 "zone_management": false, 00:28:47.197 "zone_append": false, 00:28:47.197 "compare": false, 00:28:47.197 "compare_and_write": false, 00:28:47.197 "abort": false, 00:28:47.197 "seek_hole": false, 00:28:47.197 "seek_data": false, 00:28:47.197 "copy": false, 00:28:47.197 "nvme_iov_md": false 00:28:47.197 }, 00:28:47.197 "memory_domains": [ 00:28:47.197 { 00:28:47.197 "dma_device_id": "system", 00:28:47.197 "dma_device_type": 1 00:28:47.197 }, 00:28:47.197 { 00:28:47.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:47.197 "dma_device_type": 2 00:28:47.197 }, 00:28:47.197 { 00:28:47.197 "dma_device_id": "system", 00:28:47.197 "dma_device_type": 1 00:28:47.197 }, 00:28:47.197 { 00:28:47.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:47.197 "dma_device_type": 2 00:28:47.197 } 00:28:47.197 ], 00:28:47.197 "driver_specific": { 00:28:47.197 "raid": { 00:28:47.197 "uuid": "a5c18b01-b96a-44cb-ac87-47b24aa370e0", 00:28:47.197 "strip_size_kb": 0, 00:28:47.197 "state": "online", 00:28:47.197 "raid_level": "raid1", 00:28:47.197 "superblock": true, 00:28:47.197 "num_base_bdevs": 2, 00:28:47.197 "num_base_bdevs_discovered": 2, 00:28:47.197 
"num_base_bdevs_operational": 2, 00:28:47.197 "base_bdevs_list": [ 00:28:47.197 { 00:28:47.197 "name": "BaseBdev1", 00:28:47.197 "uuid": "45fcff15-d785-4291-8f08-647a512e879f", 00:28:47.197 "is_configured": true, 00:28:47.197 "data_offset": 256, 00:28:47.197 "data_size": 7936 00:28:47.197 }, 00:28:47.197 { 00:28:47.197 "name": "BaseBdev2", 00:28:47.197 "uuid": "51538553-3c36-42be-95dd-80aa1aff32ed", 00:28:47.197 "is_configured": true, 00:28:47.197 "data_offset": 256, 00:28:47.197 "data_size": 7936 00:28:47.197 } 00:28:47.197 ] 00:28:47.197 } 00:28:47.197 } 00:28:47.197 }' 00:28:47.197 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:47.197 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:47.197 BaseBdev2' 00:28:47.197 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:47.197 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:47.197 20:04:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:47.766 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:47.766 "name": "BaseBdev1", 00:28:47.766 "aliases": [ 00:28:47.766 "45fcff15-d785-4291-8f08-647a512e879f" 00:28:47.766 ], 00:28:47.766 "product_name": "Malloc disk", 00:28:47.766 "block_size": 4096, 00:28:47.766 "num_blocks": 8192, 00:28:47.766 "uuid": "45fcff15-d785-4291-8f08-647a512e879f", 00:28:47.766 "md_size": 32, 00:28:47.766 "md_interleave": false, 00:28:47.766 "dif_type": 0, 00:28:47.766 "assigned_rate_limits": { 00:28:47.766 "rw_ios_per_sec": 0, 00:28:47.766 
"rw_mbytes_per_sec": 0, 00:28:47.766 "r_mbytes_per_sec": 0, 00:28:47.766 "w_mbytes_per_sec": 0 00:28:47.766 }, 00:28:47.766 "claimed": true, 00:28:47.766 "claim_type": "exclusive_write", 00:28:47.766 "zoned": false, 00:28:47.766 "supported_io_types": { 00:28:47.766 "read": true, 00:28:47.766 "write": true, 00:28:47.766 "unmap": true, 00:28:47.766 "flush": true, 00:28:47.766 "reset": true, 00:28:47.766 "nvme_admin": false, 00:28:47.766 "nvme_io": false, 00:28:47.766 "nvme_io_md": false, 00:28:47.766 "write_zeroes": true, 00:28:47.766 "zcopy": true, 00:28:47.766 "get_zone_info": false, 00:28:47.766 "zone_management": false, 00:28:47.766 "zone_append": false, 00:28:47.766 "compare": false, 00:28:47.766 "compare_and_write": false, 00:28:47.766 "abort": true, 00:28:47.766 "seek_hole": false, 00:28:47.766 "seek_data": false, 00:28:47.766 "copy": true, 00:28:47.766 "nvme_iov_md": false 00:28:47.766 }, 00:28:47.766 "memory_domains": [ 00:28:47.766 { 00:28:47.766 "dma_device_id": "system", 00:28:47.766 "dma_device_type": 1 00:28:47.766 }, 00:28:47.766 { 00:28:47.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:47.766 "dma_device_type": 2 00:28:47.766 } 00:28:47.766 ], 00:28:47.766 "driver_specific": {} 00:28:47.766 }' 00:28:47.766 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:47.766 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:48.025 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:48.025 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:48.025 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:48.025 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:48.025 20:04:39 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:48.025 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:48.284 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:48.284 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:48.284 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:48.284 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:48.284 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:48.284 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:48.284 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:48.543 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:48.543 "name": "BaseBdev2", 00:28:48.543 "aliases": [ 00:28:48.543 "51538553-3c36-42be-95dd-80aa1aff32ed" 00:28:48.543 ], 00:28:48.543 "product_name": "Malloc disk", 00:28:48.543 "block_size": 4096, 00:28:48.543 "num_blocks": 8192, 00:28:48.543 "uuid": "51538553-3c36-42be-95dd-80aa1aff32ed", 00:28:48.543 "md_size": 32, 00:28:48.543 "md_interleave": false, 00:28:48.543 "dif_type": 0, 00:28:48.543 "assigned_rate_limits": { 00:28:48.543 "rw_ios_per_sec": 0, 00:28:48.543 "rw_mbytes_per_sec": 0, 00:28:48.543 "r_mbytes_per_sec": 0, 00:28:48.543 "w_mbytes_per_sec": 0 00:28:48.543 }, 00:28:48.543 "claimed": true, 00:28:48.543 "claim_type": "exclusive_write", 00:28:48.543 "zoned": false, 00:28:48.543 "supported_io_types": { 
00:28:48.543 "read": true, 00:28:48.543 "write": true, 00:28:48.543 "unmap": true, 00:28:48.543 "flush": true, 00:28:48.543 "reset": true, 00:28:48.543 "nvme_admin": false, 00:28:48.543 "nvme_io": false, 00:28:48.543 "nvme_io_md": false, 00:28:48.543 "write_zeroes": true, 00:28:48.543 "zcopy": true, 00:28:48.543 "get_zone_info": false, 00:28:48.543 "zone_management": false, 00:28:48.543 "zone_append": false, 00:28:48.543 "compare": false, 00:28:48.543 "compare_and_write": false, 00:28:48.543 "abort": true, 00:28:48.543 "seek_hole": false, 00:28:48.543 "seek_data": false, 00:28:48.543 "copy": true, 00:28:48.543 "nvme_iov_md": false 00:28:48.543 }, 00:28:48.543 "memory_domains": [ 00:28:48.543 { 00:28:48.543 "dma_device_id": "system", 00:28:48.543 "dma_device_type": 1 00:28:48.543 }, 00:28:48.543 { 00:28:48.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:48.543 "dma_device_type": 2 00:28:48.543 } 00:28:48.543 ], 00:28:48.543 "driver_specific": {} 00:28:48.543 }' 00:28:48.543 20:04:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:48.543 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:48.543 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:48.543 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:48.802 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:48.802 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:48.802 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:48.802 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:48.802 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:48.802 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:49.061 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:49.061 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:49.061 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:49.321 [2024-07-24 20:04:40.800611] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:49.321 
20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.321 20:04:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:49.889 20:04:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:49.889 "name": "Existed_Raid", 00:28:49.889 "uuid": "a5c18b01-b96a-44cb-ac87-47b24aa370e0", 00:28:49.889 "strip_size_kb": 0, 00:28:49.889 "state": "online", 00:28:49.889 "raid_level": "raid1", 00:28:49.889 "superblock": true, 00:28:49.889 "num_base_bdevs": 2, 00:28:49.889 "num_base_bdevs_discovered": 1, 00:28:49.889 "num_base_bdevs_operational": 1, 00:28:49.889 "base_bdevs_list": [ 00:28:49.889 { 00:28:49.889 "name": null, 00:28:49.889 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:49.889 "is_configured": false, 00:28:49.889 "data_offset": 256, 00:28:49.889 "data_size": 7936 00:28:49.889 }, 00:28:49.889 { 00:28:49.889 "name": "BaseBdev2", 00:28:49.889 "uuid": "51538553-3c36-42be-95dd-80aa1aff32ed", 00:28:49.889 "is_configured": true, 00:28:49.889 "data_offset": 256, 00:28:49.889 "data_size": 7936 00:28:49.889 } 00:28:49.889 ] 00:28:49.889 }' 00:28:49.889 20:04:41 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:49.889 20:04:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:50.826 20:04:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:50.826 20:04:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:50.826 20:04:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:50.826 20:04:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.085 20:04:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:51.085 20:04:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:51.085 20:04:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:51.652 [2024-07-24 20:04:42.970896] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:51.652 [2024-07-24 20:04:42.970983] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:51.652 [2024-07-24 20:04:42.984438] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:51.652 [2024-07-24 20:04:42.984475] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:51.652 [2024-07-24 20:04:42.984486] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ae8250 name Existed_Raid, state offline 00:28:51.652 20:04:43 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:51.652 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:51.653 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.653 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:52.230 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:52.230 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:52.230 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1526351 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1526351 ']' 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1526351 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1526351 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:52.231 20:04:43 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1526351' 00:28:52.231 killing process with pid 1526351 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1526351 00:28:52.231 [2024-07-24 20:04:43.581860] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 1526351 00:28:52.231 [2024-07-24 20:04:43.582834] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:52.231 00:28:52.231 real 0m13.013s 00:28:52.231 user 0m23.332s 00:28:52.231 sys 0m2.279s 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:52.231 20:04:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:52.231 ************************************ 00:28:52.231 END TEST raid_state_function_test_sb_md_separate 00:28:52.231 ************************************ 00:28:52.490 20:04:43 bdev_raid -- bdev/bdev_raid.sh@986 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:52.490 20:04:43 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:28:52.490 20:04:43 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:52.490 20:04:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:52.490 ************************************ 00:28:52.490 START TEST raid_superblock_test_md_separate 00:28:52.490 ************************************ 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@414 -- # local strip_size 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@427 -- # raid_pid=1528164 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@428 -- # waitforlisten 1528164 /var/tmp/spdk-raid.sock 00:28:52.490 20:04:43 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1528164 ']' 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:52.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:52.490 20:04:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:52.490 [2024-07-24 20:04:43.965428] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:28:52.490 [2024-07-24 20:04:43.965497] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1528164 ] 00:28:52.749 [2024-07-24 20:04:44.095413] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:52.749 [2024-07-24 20:04:44.198150] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:52.749 [2024-07-24 20:04:44.257638] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:52.749 [2024-07-24 20:04:44.257672] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@864 -- # return 0 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:28:53.721 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:53.721 20:04:45 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:53.980 malloc1 00:28:53.980 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:54.239 [2024-07-24 20:04:45.642888] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:54.239 [2024-07-24 20:04:45.642938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:54.239 [2024-07-24 20:04:45.642961] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26c9fc0 00:28:54.239 [2024-07-24 20:04:45.642973] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:54.239 [2024-07-24 20:04:45.644587] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:54.239 [2024-07-24 20:04:45.644617] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:54.239 pt1 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:54.239 20:04:45 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:54.239 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:54.499 malloc2 00:28:54.499 20:04:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:54.758 [2024-07-24 20:04:46.187349] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:54.758 [2024-07-24 20:04:46.187407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:54.758 [2024-07-24 20:04:46.187428] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27dcd20 00:28:54.758 [2024-07-24 20:04:46.187440] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:54.758 [2024-07-24 20:04:46.188842] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:54.758 [2024-07-24 20:04:46.188871] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:54.758 pt2 00:28:54.758 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:28:54.758 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:28:54.758 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:55.017 [2024-07-24 20:04:46.480133] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:55.017 [2024-07-24 20:04:46.481448] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:55.017 [2024-07-24 20:04:46.481606] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27dd910 00:28:55.017 [2024-07-24 20:04:46.481619] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:55.017 [2024-07-24 20:04:46.481692] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27dfab0 00:28:55.017 [2024-07-24 20:04:46.481811] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27dd910 00:28:55.017 [2024-07-24 20:04:46.481826] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27dd910 00:28:55.017 [2024-07-24 20:04:46.481897] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:55.017 20:04:46 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.017 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:55.276 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:55.276 "name": "raid_bdev1", 00:28:55.276 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040", 00:28:55.276 "strip_size_kb": 0, 00:28:55.276 "state": "online", 00:28:55.276 "raid_level": "raid1", 00:28:55.276 "superblock": true, 00:28:55.276 "num_base_bdevs": 2, 00:28:55.276 "num_base_bdevs_discovered": 2, 00:28:55.276 "num_base_bdevs_operational": 2, 00:28:55.276 "base_bdevs_list": [ 00:28:55.276 { 00:28:55.276 "name": "pt1", 00:28:55.276 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:55.276 "is_configured": true, 00:28:55.276 "data_offset": 256, 00:28:55.276 "data_size": 7936 00:28:55.276 }, 00:28:55.276 { 00:28:55.276 "name": "pt2", 00:28:55.276 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:55.276 "is_configured": true, 00:28:55.276 "data_offset": 256, 00:28:55.276 "data_size": 7936 00:28:55.276 } 00:28:55.276 ] 00:28:55.276 }' 00:28:55.276 20:04:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:55.276 20:04:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:55.845 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:56.116 [2024-07-24 20:04:47.639427] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:56.116 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:56.116 "name": "raid_bdev1", 00:28:56.116 "aliases": [ 00:28:56.116 "94732815-794a-4777-a4bd-fff6dbb7b040" 00:28:56.116 ], 00:28:56.116 "product_name": "Raid Volume", 00:28:56.116 "block_size": 4096, 00:28:56.116 "num_blocks": 7936, 00:28:56.116 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040", 00:28:56.116 "md_size": 32, 00:28:56.116 "md_interleave": false, 00:28:56.116 "dif_type": 0, 00:28:56.116 "assigned_rate_limits": { 00:28:56.116 "rw_ios_per_sec": 0, 00:28:56.116 "rw_mbytes_per_sec": 0, 00:28:56.116 "r_mbytes_per_sec": 0, 00:28:56.116 "w_mbytes_per_sec": 0 00:28:56.116 }, 00:28:56.116 "claimed": false, 00:28:56.116 "zoned": false, 00:28:56.116 "supported_io_types": { 00:28:56.116 "read": true, 00:28:56.116 "write": true, 00:28:56.116 "unmap": false, 00:28:56.116 "flush": false, 00:28:56.116 "reset": true, 00:28:56.116 "nvme_admin": false, 00:28:56.116 "nvme_io": false, 00:28:56.116 "nvme_io_md": false, 00:28:56.116 "write_zeroes": true, 
00:28:56.116 "zcopy": false, 00:28:56.116 "get_zone_info": false, 00:28:56.116 "zone_management": false, 00:28:56.116 "zone_append": false, 00:28:56.116 "compare": false, 00:28:56.116 "compare_and_write": false, 00:28:56.116 "abort": false, 00:28:56.116 "seek_hole": false, 00:28:56.116 "seek_data": false, 00:28:56.116 "copy": false, 00:28:56.116 "nvme_iov_md": false 00:28:56.116 }, 00:28:56.116 "memory_domains": [ 00:28:56.116 { 00:28:56.116 "dma_device_id": "system", 00:28:56.116 "dma_device_type": 1 00:28:56.116 }, 00:28:56.116 { 00:28:56.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:56.116 "dma_device_type": 2 00:28:56.116 }, 00:28:56.116 { 00:28:56.116 "dma_device_id": "system", 00:28:56.116 "dma_device_type": 1 00:28:56.116 }, 00:28:56.116 { 00:28:56.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:56.116 "dma_device_type": 2 00:28:56.116 } 00:28:56.116 ], 00:28:56.116 "driver_specific": { 00:28:56.116 "raid": { 00:28:56.116 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040", 00:28:56.116 "strip_size_kb": 0, 00:28:56.116 "state": "online", 00:28:56.116 "raid_level": "raid1", 00:28:56.116 "superblock": true, 00:28:56.117 "num_base_bdevs": 2, 00:28:56.117 "num_base_bdevs_discovered": 2, 00:28:56.117 "num_base_bdevs_operational": 2, 00:28:56.117 "base_bdevs_list": [ 00:28:56.117 { 00:28:56.117 "name": "pt1", 00:28:56.117 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:56.117 "is_configured": true, 00:28:56.117 "data_offset": 256, 00:28:56.117 "data_size": 7936 00:28:56.117 }, 00:28:56.117 { 00:28:56.117 "name": "pt2", 00:28:56.117 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:56.117 "is_configured": true, 00:28:56.117 "data_offset": 256, 00:28:56.117 "data_size": 7936 00:28:56.117 } 00:28:56.117 ] 00:28:56.117 } 00:28:56.117 } 00:28:56.117 }' 00:28:56.117 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:56.379 20:04:47 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:56.379 pt2' 00:28:56.379 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:56.379 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:56.379 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:56.379 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:56.379 "name": "pt1", 00:28:56.379 "aliases": [ 00:28:56.379 "00000000-0000-0000-0000-000000000001" 00:28:56.379 ], 00:28:56.379 "product_name": "passthru", 00:28:56.379 "block_size": 4096, 00:28:56.379 "num_blocks": 8192, 00:28:56.379 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:56.379 "md_size": 32, 00:28:56.379 "md_interleave": false, 00:28:56.379 "dif_type": 0, 00:28:56.379 "assigned_rate_limits": { 00:28:56.379 "rw_ios_per_sec": 0, 00:28:56.379 "rw_mbytes_per_sec": 0, 00:28:56.379 "r_mbytes_per_sec": 0, 00:28:56.379 "w_mbytes_per_sec": 0 00:28:56.379 }, 00:28:56.379 "claimed": true, 00:28:56.379 "claim_type": "exclusive_write", 00:28:56.379 "zoned": false, 00:28:56.379 "supported_io_types": { 00:28:56.379 "read": true, 00:28:56.379 "write": true, 00:28:56.379 "unmap": true, 00:28:56.379 "flush": true, 00:28:56.379 "reset": true, 00:28:56.379 "nvme_admin": false, 00:28:56.379 "nvme_io": false, 00:28:56.379 "nvme_io_md": false, 00:28:56.379 "write_zeroes": true, 00:28:56.379 "zcopy": true, 00:28:56.379 "get_zone_info": false, 00:28:56.379 "zone_management": false, 00:28:56.379 "zone_append": false, 00:28:56.379 "compare": false, 00:28:56.379 "compare_and_write": false, 00:28:56.379 "abort": true, 00:28:56.379 "seek_hole": false, 00:28:56.379 "seek_data": false, 00:28:56.379 "copy": true, 00:28:56.379 
"nvme_iov_md": false 00:28:56.379 }, 00:28:56.379 "memory_domains": [ 00:28:56.379 { 00:28:56.379 "dma_device_id": "system", 00:28:56.379 "dma_device_type": 1 00:28:56.379 }, 00:28:56.379 { 00:28:56.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:56.379 "dma_device_type": 2 00:28:56.379 } 00:28:56.379 ], 00:28:56.379 "driver_specific": { 00:28:56.379 "passthru": { 00:28:56.379 "name": "pt1", 00:28:56.379 "base_bdev_name": "malloc1" 00:28:56.379 } 00:28:56.379 } 00:28:56.379 }' 00:28:56.379 20:04:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:56.639 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:56.639 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:56.639 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:56.639 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:56.639 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:56.639 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:56.639 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:56.898 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:56.898 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:56.898 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:56.898 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:56.898 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:56.898 20:04:48 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:56.898 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:57.157 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:57.157 "name": "pt2", 00:28:57.157 "aliases": [ 00:28:57.157 "00000000-0000-0000-0000-000000000002" 00:28:57.157 ], 00:28:57.157 "product_name": "passthru", 00:28:57.157 "block_size": 4096, 00:28:57.157 "num_blocks": 8192, 00:28:57.157 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:57.157 "md_size": 32, 00:28:57.157 "md_interleave": false, 00:28:57.157 "dif_type": 0, 00:28:57.157 "assigned_rate_limits": { 00:28:57.157 "rw_ios_per_sec": 0, 00:28:57.157 "rw_mbytes_per_sec": 0, 00:28:57.157 "r_mbytes_per_sec": 0, 00:28:57.157 "w_mbytes_per_sec": 0 00:28:57.157 }, 00:28:57.157 "claimed": true, 00:28:57.157 "claim_type": "exclusive_write", 00:28:57.157 "zoned": false, 00:28:57.157 "supported_io_types": { 00:28:57.157 "read": true, 00:28:57.157 "write": true, 00:28:57.157 "unmap": true, 00:28:57.157 "flush": true, 00:28:57.157 "reset": true, 00:28:57.157 "nvme_admin": false, 00:28:57.157 "nvme_io": false, 00:28:57.157 "nvme_io_md": false, 00:28:57.157 "write_zeroes": true, 00:28:57.157 "zcopy": true, 00:28:57.157 "get_zone_info": false, 00:28:57.157 "zone_management": false, 00:28:57.157 "zone_append": false, 00:28:57.157 "compare": false, 00:28:57.157 "compare_and_write": false, 00:28:57.157 "abort": true, 00:28:57.157 "seek_hole": false, 00:28:57.157 "seek_data": false, 00:28:57.157 "copy": true, 00:28:57.157 "nvme_iov_md": false 00:28:57.157 }, 00:28:57.157 "memory_domains": [ 00:28:57.157 { 00:28:57.157 "dma_device_id": "system", 00:28:57.157 "dma_device_type": 1 00:28:57.157 }, 00:28:57.157 { 00:28:57.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:57.157 "dma_device_type": 2 00:28:57.157 } 
00:28:57.157 ], 00:28:57.157 "driver_specific": { 00:28:57.157 "passthru": { 00:28:57.157 "name": "pt2", 00:28:57.157 "base_bdev_name": "malloc2" 00:28:57.157 } 00:28:57.157 } 00:28:57.157 }' 00:28:57.157 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:57.157 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:57.157 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:57.157 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:57.157 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:57.416 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:57.416 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:57.416 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:57.416 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:57.416 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:57.416 20:04:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:57.676 20:04:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:57.676 20:04:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:57.676 20:04:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:28:57.935 [2024-07-24 20:04:49.524444] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:58.195 20:04:49 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=94732815-794a-4777-a4bd-fff6dbb7b040
00:28:58.195 20:04:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # '[' -z 94732815-794a-4777-a4bd-fff6dbb7b040 ']'
00:28:58.195 20:04:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:28:58.195 [2024-07-24 20:04:49.781005] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:28:58.195 [2024-07-24 20:04:49.781029] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:28:58.195 [2024-07-24 20:04:49.781087] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:28:58.195 [2024-07-24 20:04:49.781143] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:28:58.195 [2024-07-24 20:04:49.781156] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27dd910 name raid_bdev1, state offline
00:28:58.454 20:04:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:58.454 20:04:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # jq -r '.[]'
00:28:59.023 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # raid_bdev=
00:28:59.023 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']'
00:28:59.023 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}"
00:28:59.023 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:28:59.023 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}"
00:28:59.023 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:28:59.282 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs
00:28:59.282 20:04:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any'
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # '[' false == true ']'
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # local es=0
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1
00:28:59.850 [2024-07-24 20:04:51.385167] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed
00:28:59.850 [2024-07-24 20:04:51.386514] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed
00:28:59.850 [2024-07-24 20:04:51.386569] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1
00:28:59.850 [2024-07-24 20:04:51.386611] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2
00:28:59.850 [2024-07-24 20:04:51.386631] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:28:59.850 [2024-07-24 20:04:51.386640] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2647f10 name raid_bdev1, state configuring
00:28:59.850 request:
00:28:59.850 {
00:28:59.850 "name": "raid_bdev1",
00:28:59.850 "raid_level": "raid1",
00:28:59.850 "base_bdevs": [
00:28:59.850 "malloc1",
00:28:59.850 "malloc2"
00:28:59.850 ],
00:28:59.850 "superblock": false,
00:28:59.850 "method": "bdev_raid_create",
00:28:59.850 "req_id": 1
00:28:59.850 }
00:28:59.850 Got JSON-RPC error response
00:28:59.850 response:
00:28:59.850 {
00:28:59.850 "code": -17,
00:28:59.850 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:28:59.850 }
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # es=1
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:59.850 20:04:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # jq -r '.[]'
00:29:00.419 20:04:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # raid_bdev=
00:29:00.419 20:04:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']'
00:29:00.419 20:04:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:29:00.678 [2024-07-24 20:04:52.147109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:29:00.678 [2024-07-24 20:04:52.147159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:00.678 [2024-07-24 20:04:52.147179] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26486f0
00:29:00.678 [2024-07-24 20:04:52.147191] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:00.678 [2024-07-24 20:04:52.148681] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:00.678 [2024-07-24 20:04:52.148710] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:29:00.678 [2024-07-24 20:04:52.148760] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:29:00.678 [2024-07-24 20:04:52.148785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:29:00.678 pt1
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:00.678 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:00.937 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:00.937 "name": "raid_bdev1",
00:29:00.937 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040",
00:29:00.937 "strip_size_kb": 0,
00:29:00.937 "state": "configuring",
00:29:00.937 "raid_level": "raid1",
00:29:00.937 "superblock": true,
00:29:00.937 "num_base_bdevs": 2,
00:29:00.937 "num_base_bdevs_discovered": 1,
00:29:00.937 "num_base_bdevs_operational": 2,
00:29:00.937 "base_bdevs_list": [
00:29:00.937 {
00:29:00.937 "name": "pt1",
00:29:00.937 "uuid": "00000000-0000-0000-0000-000000000001",
00:29:00.937 "is_configured": true,
00:29:00.937 "data_offset": 256,
00:29:00.937 "data_size": 7936
00:29:00.937 },
00:29:00.937 {
00:29:00.937 "name": null,
00:29:00.937 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:00.937 "is_configured": false,
00:29:00.937 "data_offset": 256,
00:29:00.937 "data_size": 7936
00:29:00.937 }
00:29:00.937 ]
00:29:00.937 }'
00:29:00.938 20:04:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:00.938 20:04:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:29:01.507 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']'
00:29:01.507 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i = 1 ))
00:29:01.507 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs ))
00:29:01.507 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:29:01.766 [2024-07-24 20:04:53.209935] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:29:01.766 [2024-07-24 20:04:53.209983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:01.766 [2024-07-24 20:04:53.210001] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27dff10
00:29:01.766 [2024-07-24 20:04:53.210014] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:01.766 [2024-07-24 20:04:53.210208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:01.766 [2024-07-24 20:04:53.210226] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:29:01.766 [2024-07-24 20:04:53.210270] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:29:01.766 [2024-07-24 20:04:53.210288] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:29:01.766 [2024-07-24 20:04:53.210381] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27e04e0
00:29:01.766 [2024-07-24 20:04:53.210401] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:29:01.766 [2024-07-24 20:04:53.210460] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27e4410
00:29:01.766 [2024-07-24 20:04:53.210561] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27e04e0
00:29:01.766 [2024-07-24 20:04:53.210571] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27e04e0
00:29:01.766 [2024-07-24 20:04:53.210639] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:29:01.766 pt2
00:29:01.766 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i++ ))
00:29:01.766 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs ))
00:29:01.766 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:01.767 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:02.025 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:02.025 "name": "raid_bdev1",
00:29:02.025 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040",
00:29:02.025 "strip_size_kb": 0,
00:29:02.025 "state": "online",
00:29:02.025 "raid_level": "raid1",
00:29:02.025 "superblock": true,
00:29:02.025 "num_base_bdevs": 2,
00:29:02.025 "num_base_bdevs_discovered": 2,
00:29:02.025 "num_base_bdevs_operational": 2,
00:29:02.025 "base_bdevs_list": [
00:29:02.025 {
00:29:02.025 "name": "pt1",
00:29:02.025 "uuid": "00000000-0000-0000-0000-000000000001",
00:29:02.025 "is_configured": true,
00:29:02.025 "data_offset": 256,
00:29:02.025 "data_size": 7936
00:29:02.025 },
00:29:02.025 {
00:29:02.025 "name": "pt2",
00:29:02.025 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:02.025 "is_configured": true,
00:29:02.025 "data_offset": 256,
00:29:02.025 "data_size": 7936
00:29:02.025 }
00:29:02.025 ]
00:29:02.025 }'
00:29:02.025 20:04:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:02.025 20:04:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:29:02.593 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:29:02.593 [2024-07-24 20:04:54.172734] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:29:02.853 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:29:02.853 "name": "raid_bdev1",
00:29:02.853 "aliases": [
00:29:02.853 "94732815-794a-4777-a4bd-fff6dbb7b040"
00:29:02.853 ],
00:29:02.853 "product_name": "Raid Volume",
00:29:02.853 "block_size": 4096,
00:29:02.853 "num_blocks": 7936,
00:29:02.853 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040",
00:29:02.853 "md_size": 32,
00:29:02.853 "md_interleave": false,
00:29:02.853 "dif_type": 0,
00:29:02.853 "assigned_rate_limits": {
00:29:02.853 "rw_ios_per_sec": 0,
00:29:02.853 "rw_mbytes_per_sec": 0,
00:29:02.853 "r_mbytes_per_sec": 0,
00:29:02.853 "w_mbytes_per_sec": 0
00:29:02.853 },
00:29:02.853 "claimed": false,
00:29:02.853 "zoned": false,
00:29:02.853 "supported_io_types": {
00:29:02.853 "read": true,
00:29:02.853 "write": true,
00:29:02.853 "unmap": false,
00:29:02.853 "flush": false,
00:29:02.853 "reset": true,
00:29:02.853 "nvme_admin": false,
00:29:02.853 "nvme_io": false,
00:29:02.853 "nvme_io_md": false,
00:29:02.853 "write_zeroes": true,
00:29:02.853 "zcopy": false,
00:29:02.853 "get_zone_info": false,
00:29:02.853 "zone_management": false,
00:29:02.853 "zone_append": false,
00:29:02.853 "compare": false,
00:29:02.853 "compare_and_write": false,
00:29:02.853 "abort": false,
00:29:02.853 "seek_hole": false,
00:29:02.853 "seek_data": false,
00:29:02.853 "copy": false,
00:29:02.853 "nvme_iov_md": false
00:29:02.853 },
00:29:02.853 "memory_domains": [
00:29:02.853 {
00:29:02.853 "dma_device_id": "system",
00:29:02.853 "dma_device_type": 1
00:29:02.853 },
00:29:02.853 {
00:29:02.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:29:02.853 "dma_device_type": 2
00:29:02.853 },
00:29:02.853 {
00:29:02.853 "dma_device_id": "system",
00:29:02.853 "dma_device_type": 1
00:29:02.853 },
00:29:02.853 {
00:29:02.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:29:02.853 "dma_device_type": 2
00:29:02.853 }
00:29:02.853 ],
00:29:02.853 "driver_specific": {
00:29:02.853 "raid": {
00:29:02.853 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040",
00:29:02.853 "strip_size_kb": 0,
00:29:02.853 "state": "online",
00:29:02.853 "raid_level": "raid1",
00:29:02.853 "superblock": true,
00:29:02.853 "num_base_bdevs": 2,
00:29:02.853 "num_base_bdevs_discovered": 2,
00:29:02.853 "num_base_bdevs_operational": 2,
00:29:02.853 "base_bdevs_list": [
00:29:02.853 {
00:29:02.853 "name": "pt1",
00:29:02.853 "uuid": "00000000-0000-0000-0000-000000000001",
00:29:02.853 "is_configured": true,
00:29:02.853 "data_offset": 256,
00:29:02.853 "data_size": 7936
00:29:02.853 },
00:29:02.853 {
00:29:02.853 "name": "pt2",
00:29:02.853 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:02.853 "is_configured": true,
00:29:02.853 "data_offset": 256,
00:29:02.853 "data_size": 7936
00:29:02.853 }
00:29:02.853 ]
00:29:02.853 }
00:29:02.853 }
00:29:02.853 }'
00:29:02.853 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:29:02.853 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:29:02.853 pt2'
00:29:02.853 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:29:02.853 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:29:02.853 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:29:03.112 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:29:03.112 "name": "pt1",
00:29:03.112 "aliases": [
00:29:03.112 "00000000-0000-0000-0000-000000000001"
00:29:03.112 ],
00:29:03.112 "product_name": "passthru",
00:29:03.112 "block_size": 4096,
00:29:03.112 "num_blocks": 8192,
00:29:03.112 "uuid": "00000000-0000-0000-0000-000000000001",
00:29:03.112 "md_size": 32,
00:29:03.112 "md_interleave": false,
00:29:03.112 "dif_type": 0,
00:29:03.112 "assigned_rate_limits": {
00:29:03.112 "rw_ios_per_sec": 0,
00:29:03.112 "rw_mbytes_per_sec": 0,
00:29:03.112 "r_mbytes_per_sec": 0,
00:29:03.112 "w_mbytes_per_sec": 0
00:29:03.112 },
00:29:03.112 "claimed": true,
00:29:03.112 "claim_type": "exclusive_write",
00:29:03.112 "zoned": false,
00:29:03.112 "supported_io_types": {
00:29:03.112 "read": true,
00:29:03.112 "write": true,
00:29:03.112 "unmap": true,
00:29:03.112 "flush": true,
00:29:03.112 "reset": true,
00:29:03.112 "nvme_admin": false,
00:29:03.112 "nvme_io": false,
00:29:03.112 "nvme_io_md": false,
00:29:03.112 "write_zeroes": true,
00:29:03.112 "zcopy": true,
00:29:03.112 "get_zone_info": false,
00:29:03.112 "zone_management": false,
00:29:03.112 "zone_append": false,
00:29:03.112 "compare": false,
00:29:03.112 "compare_and_write": false,
00:29:03.112 "abort": true,
00:29:03.113 "seek_hole": false,
00:29:03.113 "seek_data": false,
00:29:03.113 "copy": true,
00:29:03.113 "nvme_iov_md": false
00:29:03.113 },
00:29:03.113 "memory_domains": [
00:29:03.113 {
00:29:03.113 "dma_device_id": "system",
00:29:03.113 "dma_device_type": 1
00:29:03.113 },
00:29:03.113 {
00:29:03.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:29:03.113 "dma_device_type": 2
00:29:03.113 }
00:29:03.113 ],
00:29:03.113 "driver_specific": {
00:29:03.113 "passthru": {
00:29:03.113 "name": "pt1",
00:29:03.113 "base_bdev_name": "malloc1"
00:29:03.113 }
00:29:03.113 }
00:29:03.113 }'
00:29:03.113 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:29:03.113 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:29:03.113 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:29:03.113 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:29:03.113 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]]
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]]
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]]
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:29:03.372 20:04:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:29:03.941 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:29:03.941 "name": "pt2",
00:29:03.941 "aliases": [
00:29:03.941 "00000000-0000-0000-0000-000000000002"
00:29:03.941 ],
00:29:03.941 "product_name": "passthru",
00:29:03.941 "block_size": 4096,
00:29:03.941 "num_blocks": 8192,
00:29:03.941 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:03.941 "md_size": 32,
00:29:03.941 "md_interleave": false,
00:29:03.941 "dif_type": 0,
00:29:03.941 "assigned_rate_limits": {
00:29:03.941 "rw_ios_per_sec": 0,
00:29:03.941 "rw_mbytes_per_sec": 0,
00:29:03.941 "r_mbytes_per_sec": 0,
00:29:03.941 "w_mbytes_per_sec": 0
00:29:03.941 },
00:29:03.941 "claimed": true,
00:29:03.941 "claim_type": "exclusive_write",
00:29:03.941 "zoned": false,
00:29:03.941 "supported_io_types": {
00:29:03.941 "read": true,
00:29:03.941 "write": true,
00:29:03.941 "unmap": true,
00:29:03.941 "flush": true,
00:29:03.941 "reset": true,
00:29:03.941 "nvme_admin": false,
00:29:03.941 "nvme_io": false,
00:29:03.941 "nvme_io_md": false,
00:29:03.941 "write_zeroes": true,
00:29:03.941 "zcopy": true,
00:29:03.941 "get_zone_info": false,
00:29:03.941 "zone_management": false,
00:29:03.941 "zone_append": false,
00:29:03.941 "compare": false,
00:29:03.941 "compare_and_write": false,
00:29:03.941 "abort": true,
00:29:03.941 "seek_hole": false,
00:29:03.941 "seek_data": false,
00:29:03.941 "copy": true,
00:29:03.941 "nvme_iov_md": false
00:29:03.941 },
00:29:03.941 "memory_domains": [
00:29:03.941 {
00:29:03.941 "dma_device_id": "system",
00:29:03.941 "dma_device_type": 1
00:29:03.941 },
00:29:03.941 {
00:29:03.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:29:03.941 "dma_device_type": 2
00:29:03.941 }
00:29:03.941 ],
00:29:03.941 "driver_specific": {
00:29:03.941 "passthru": {
00:29:03.941 "name": "pt2",
00:29:03.941 "base_bdev_name": "malloc2"
00:29:03.941 }
00:29:03.941 }
00:29:03.941 }'
00:29:03.941 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:29:03.941 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:29:03.941 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:29:03.941 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]]
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]]
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]]
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:29:04.200 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid'
00:29:04.459 [2024-07-24 20:04:55.909322] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:29:04.459 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # '[' 94732815-794a-4777-a4bd-fff6dbb7b040 '!=' 94732815-794a-4777-a4bd-fff6dbb7b040 ']'
00:29:04.459 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1
00:29:04.459 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in
00:29:04.459 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0
00:29:04.459 20:04:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:29:05.027 [2024-07-24 20:04:56.422474] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:05.027 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:05.286 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:05.286 "name": "raid_bdev1",
00:29:05.286 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040",
00:29:05.286 "strip_size_kb": 0,
00:29:05.286 "state": "online",
00:29:05.286 "raid_level": "raid1",
00:29:05.286 "superblock": true,
00:29:05.286 "num_base_bdevs": 2,
00:29:05.286 "num_base_bdevs_discovered": 1,
00:29:05.286 "num_base_bdevs_operational": 1,
00:29:05.286 "base_bdevs_list": [
00:29:05.286 {
00:29:05.286 "name": null,
00:29:05.286 "uuid": "00000000-0000-0000-0000-000000000000",
00:29:05.286 "is_configured": false,
00:29:05.286 "data_offset": 256,
00:29:05.286 "data_size": 7936
00:29:05.286 },
00:29:05.286 {
00:29:05.286 "name": "pt2",
00:29:05.286 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:05.286 "is_configured": true,
00:29:05.286 "data_offset": 256,
00:29:05.286 "data_size": 7936
00:29:05.286 }
00:29:05.286 ]
00:29:05.286 }'
00:29:05.286 20:04:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:05.286 20:04:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:29:05.853 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:29:06.111 [2024-07-24 20:04:57.565490] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:29:06.111 [2024-07-24 20:04:57.565516] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:29:06.111 [2024-07-24 20:04:57.565564] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:29:06.111 [2024-07-24 20:04:57.565605] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:29:06.111 [2024-07-24 20:04:57.565617] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27e04e0 name raid_bdev1, state offline
00:29:06.111 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:06.111 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # jq -r '.[]'
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # raid_bdev=
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']'
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i = 1 ))
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs ))
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i++ ))
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs ))
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i = 1 ))
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 ))
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@534 -- # i=1
00:29:06.370 20:04:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:29:06.629 [2024-07-24 20:04:58.118941] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:29:06.629 [2024-07-24 20:04:58.118985] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:06.630 [2024-07-24 20:04:58.119002] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ca1f0
00:29:06.630 [2024-07-24 20:04:58.119014] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:06.630 [2024-07-24 20:04:58.120445] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:06.630 [2024-07-24 20:04:58.120473] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:29:06.630 [2024-07-24 20:04:58.120518] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:29:06.630 [2024-07-24 20:04:58.120541] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:29:06.630 [2024-07-24 20:04:58.120620] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27e1270
00:29:06.630 [2024-07-24 20:04:58.120630] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:29:06.630 [2024-07-24 20:04:58.120683] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x263f7b0
00:29:06.630 [2024-07-24 20:04:58.120780] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27e1270
00:29:06.630 [2024-07-24 20:04:58.120790] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27e1270
00:29:06.630 [2024-07-24 20:04:58.120852] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:29:06.630 pt2
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:06.630 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:06.889 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:06.889 "name": "raid_bdev1",
00:29:06.889 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040",
00:29:06.889 "strip_size_kb": 0,
00:29:06.889 "state": "online",
00:29:06.889 "raid_level": "raid1",
00:29:06.889 "superblock": true,
00:29:06.889 "num_base_bdevs": 2,
00:29:06.889 "num_base_bdevs_discovered": 1,
00:29:06.889 "num_base_bdevs_operational": 1,
00:29:06.889 "base_bdevs_list": [
00:29:06.889 {
00:29:06.889 "name": null,
00:29:06.889 "uuid": "00000000-0000-0000-0000-000000000000",
00:29:06.889 "is_configured": false,
00:29:06.889 "data_offset": 256,
00:29:06.889 "data_size": 7936
00:29:06.889 },
00:29:06.889 {
00:29:06.889 "name": "pt2",
00:29:06.889 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:06.889 "is_configured": true,
00:29:06.889 "data_offset": 256,
00:29:06.889 "data_size": 7936
00:29:06.889 }
00:29:06.889 ]
00:29:06.889 }'
00:29:06.889 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:06.889 20:04:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:29:07.456 20:04:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:29:07.456 [2024-07-24 20:04:59.013291] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:29:07.456 [2024-07-24 20:04:59.013315] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:29:07.456 [2024-07-24 20:04:59.013365] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:29:07.456 [2024-07-24 20:04:59.013414] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:29:07.456 [2024-07-24 20:04:59.013426] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27e1270 name raid_bdev1, state offline
00:29:07.456 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:07.456 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # jq -r '.[]'
00:29:07.716 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # raid_bdev=
00:29:07.716 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']'
00:29:07.716 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']'
00:29:07.716 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:29:07.975 [2024-07-24 20:04:59.378248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:29:07.975 [2024-07-24 20:04:59.378293] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:07.975 [2024-07-24 20:04:59.378311] vbdev_passthru.c:
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27e0ef0 00:29:07.975 [2024-07-24 20:04:59.378324] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:07.975 [2024-07-24 20:04:59.379757] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:07.975 [2024-07-24 20:04:59.379785] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:07.975 [2024-07-24 20:04:59.379831] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:07.975 [2024-07-24 20:04:59.379855] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:07.975 [2024-07-24 20:04:59.379947] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:07.975 [2024-07-24 20:04:59.379960] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:07.975 [2024-07-24 20:04:59.379975] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27e3cb0 name raid_bdev1, state configuring 00:29:07.975 [2024-07-24 20:04:59.380004] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:07.975 [2024-07-24 20:04:59.380052] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27e1b60 00:29:07.975 [2024-07-24 20:04:59.380062] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:07.975 [2024-07-24 20:04:59.380118] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27dd8b0 00:29:07.975 [2024-07-24 20:04:59.380215] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27e1b60 00:29:07.975 [2024-07-24 20:04:59.380225] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27e1b60 00:29:07.975 [2024-07-24 20:04:59.380295] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:29:07.975 pt1 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.975 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:08.234 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:08.234 "name": "raid_bdev1", 00:29:08.234 "uuid": "94732815-794a-4777-a4bd-fff6dbb7b040", 00:29:08.234 "strip_size_kb": 0, 00:29:08.234 "state": "online", 00:29:08.234 "raid_level": 
"raid1", 00:29:08.234 "superblock": true, 00:29:08.234 "num_base_bdevs": 2, 00:29:08.234 "num_base_bdevs_discovered": 1, 00:29:08.234 "num_base_bdevs_operational": 1, 00:29:08.234 "base_bdevs_list": [ 00:29:08.234 { 00:29:08.234 "name": null, 00:29:08.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:08.234 "is_configured": false, 00:29:08.234 "data_offset": 256, 00:29:08.234 "data_size": 7936 00:29:08.234 }, 00:29:08.234 { 00:29:08.234 "name": "pt2", 00:29:08.234 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:08.234 "is_configured": true, 00:29:08.234 "data_offset": 256, 00:29:08.234 "data_size": 7936 00:29:08.234 } 00:29:08.234 ] 00:29:08.234 }' 00:29:08.234 20:04:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:08.234 20:04:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:08.802 20:05:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:08.802 20:05:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:09.061 20:05:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:29:09.061 20:05:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:09.061 20:05:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:29:09.320 [2024-07-24 20:05:00.770164] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # '[' 94732815-794a-4777-a4bd-fff6dbb7b040 '!=' 94732815-794a-4777-a4bd-fff6dbb7b040 ']' 
00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@578 -- # killprocess 1528164 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1528164 ']' 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # kill -0 1528164 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # uname 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1528164 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1528164' 00:29:09.320 killing process with pid 1528164 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@969 -- # kill 1528164 00:29:09.320 [2024-07-24 20:05:00.839768] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:09.320 [2024-07-24 20:05:00.839823] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:09.320 [2024-07-24 20:05:00.839869] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:09.320 [2024-07-24 20:05:00.839881] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27e1b60 name raid_bdev1, state offline 00:29:09.320 20:05:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@974 -- # wait 1528164 00:29:09.320 [2024-07-24 20:05:00.866047] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:09.579 20:05:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@580 -- # return 0 00:29:09.579 00:29:09.579 real 0m17.190s 00:29:09.579 user 0m31.301s 00:29:09.579 sys 0m3.146s 00:29:09.579 20:05:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:09.579 20:05:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:09.579 ************************************ 00:29:09.579 END TEST raid_superblock_test_md_separate 00:29:09.579 ************************************ 00:29:09.579 20:05:01 bdev_raid -- bdev/bdev_raid.sh@987 -- # '[' true = true ']' 00:29:09.579 20:05:01 bdev_raid -- bdev/bdev_raid.sh@988 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:29:09.579 20:05:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:29:09.579 20:05:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:09.579 20:05:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:09.869 ************************************ 00:29:09.870 START TEST raid_rebuild_test_sb_md_separate 00:29:09.870 ************************************ 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # local 
verify=true 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # local strip_size 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # local create_arg 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@594 -- # local data_offset 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:29:09.870 20:05:01 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # raid_pid=1530742 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@613 -- # waitforlisten 1530742 /var/tmp/spdk-raid.sock 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1530742 ']' 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:09.870 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:09.870 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:09.870 [2024-07-24 20:05:01.248726] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:29:09.870 [2024-07-24 20:05:01.248789] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1530742 ] 00:29:09.870 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:09.870 Zero copy mechanism will not be used. 00:29:09.870 [2024-07-24 20:05:01.377757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:10.145 [2024-07-24 20:05:01.483270] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:10.145 [2024-07-24 20:05:01.538447] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:10.145 [2024-07-24 20:05:01.538478] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:10.145 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:10.145 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:29:10.145 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:29:10.145 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:29:10.404 BaseBdev1_malloc 00:29:10.404 20:05:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:10.663 [2024-07-24 20:05:02.188062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:10.663 [2024-07-24 20:05:02.188113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:10.663 [2024-07-24 
20:05:02.188135] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2113060 00:29:10.663 [2024-07-24 20:05:02.188148] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:10.663 [2024-07-24 20:05:02.189723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:10.663 [2024-07-24 20:05:02.189754] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:10.663 BaseBdev1 00:29:10.663 20:05:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:29:10.663 20:05:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:29:10.922 BaseBdev2_malloc 00:29:10.922 20:05:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:11.181 [2024-07-24 20:05:02.684438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:11.181 [2024-07-24 20:05:02.684487] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:11.181 [2024-07-24 20:05:02.684508] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21091a0 00:29:11.181 [2024-07-24 20:05:02.684520] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:11.181 [2024-07-24 20:05:02.685939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:11.182 [2024-07-24 20:05:02.685968] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:11.182 BaseBdev2 00:29:11.182 20:05:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@622 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:29:11.441 spare_malloc 00:29:11.441 20:05:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:11.700 spare_delay 00:29:11.700 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:11.960 [2024-07-24 20:05:03.405068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:11.960 [2024-07-24 20:05:03.405120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:11.960 [2024-07-24 20:05:03.405141] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210cbf0 00:29:11.960 [2024-07-24 20:05:03.405154] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:11.960 [2024-07-24 20:05:03.406595] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:11.960 [2024-07-24 20:05:03.406622] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:11.960 spare 00:29:11.960 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:12.218 [2024-07-24 20:05:03.649739] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:12.218 [2024-07-24 20:05:03.651055] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:12.218 [2024-07-24 20:05:03.651224] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x210e2d0 00:29:12.218 [2024-07-24 20:05:03.651237] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:12.218 [2024-07-24 20:05:03.651310] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f75a20 00:29:12.218 [2024-07-24 20:05:03.651436] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x210e2d0 00:29:12.218 [2024-07-24 20:05:03.651447] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x210e2d0 00:29:12.218 [2024-07-24 20:05:03.651517] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:12.218 20:05:03 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.218 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.477 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.477 "name": "raid_bdev1", 00:29:12.477 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:12.477 "strip_size_kb": 0, 00:29:12.478 "state": "online", 00:29:12.478 "raid_level": "raid1", 00:29:12.478 "superblock": true, 00:29:12.478 "num_base_bdevs": 2, 00:29:12.478 "num_base_bdevs_discovered": 2, 00:29:12.478 "num_base_bdevs_operational": 2, 00:29:12.478 "base_bdevs_list": [ 00:29:12.478 { 00:29:12.478 "name": "BaseBdev1", 00:29:12.478 "uuid": "68d2736c-142f-5465-92e8-0094946e777c", 00:29:12.478 "is_configured": true, 00:29:12.478 "data_offset": 256, 00:29:12.478 "data_size": 7936 00:29:12.478 }, 00:29:12.478 { 00:29:12.478 "name": "BaseBdev2", 00:29:12.478 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:12.478 "is_configured": true, 00:29:12.478 "data_offset": 256, 00:29:12.478 "data_size": 7936 00:29:12.478 } 00:29:12.478 ] 00:29:12.478 }' 00:29:12.478 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.478 20:05:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:13.046 20:05:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:13.046 20:05:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:29:13.305 [2024-07-24 20:05:04.740847] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:13.305 
20:05:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:29:13.305 20:05:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.305 20:05:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:13.564 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:13.564 20:05:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:29:13.823 [2024-07-24 20:05:05.241961] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2111f20 00:29:13.823 /dev/nbd0 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:13.823 1+0 records in 00:29:13.823 1+0 records out 00:29:13.823 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273094 s, 15.0 MB/s 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:29:13.823 20:05:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:29:14.760 7936+0 records in 00:29:14.760 7936+0 records out 00:29:14.760 32505856 bytes (33 MB, 31 MiB) copied, 0.751014 s, 43.3 MB/s 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:29:14.760 20:05:06 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:14.760 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:29:14.760 [2024-07-24 20:05:06.344878] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:15.019 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:29:15.019 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:15.019 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:15.019 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:15.019 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:15.019 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:15.278 
[2024-07-24 20:05:06.669778] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.278 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.537 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:15.537 "name": "raid_bdev1", 00:29:15.537 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:15.537 "strip_size_kb": 0, 00:29:15.537 "state": "online", 00:29:15.537 "raid_level": "raid1", 00:29:15.537 
"superblock": true, 00:29:15.537 "num_base_bdevs": 2, 00:29:15.537 "num_base_bdevs_discovered": 1, 00:29:15.537 "num_base_bdevs_operational": 1, 00:29:15.537 "base_bdevs_list": [ 00:29:15.537 { 00:29:15.537 "name": null, 00:29:15.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:15.537 "is_configured": false, 00:29:15.537 "data_offset": 256, 00:29:15.537 "data_size": 7936 00:29:15.537 }, 00:29:15.537 { 00:29:15.537 "name": "BaseBdev2", 00:29:15.537 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:15.537 "is_configured": true, 00:29:15.537 "data_offset": 256, 00:29:15.537 "data_size": 7936 00:29:15.537 } 00:29:15.537 ] 00:29:15.537 }' 00:29:15.538 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:15.538 20:05:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:16.105 20:05:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:16.364 [2024-07-24 20:05:07.764709] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:16.364 [2024-07-24 20:05:07.767099] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2111f20 00:29:16.364 [2024-07-24 20:05:07.769321] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:16.364 20:05:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:17.301 20:05:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:17.301 20:05:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:17.301 20:05:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:17.301 20:05:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:17.301 20:05:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:17.301 20:05:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.301 20:05:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.560 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:17.560 "name": "raid_bdev1", 00:29:17.560 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:17.560 "strip_size_kb": 0, 00:29:17.560 "state": "online", 00:29:17.560 "raid_level": "raid1", 00:29:17.560 "superblock": true, 00:29:17.560 "num_base_bdevs": 2, 00:29:17.560 "num_base_bdevs_discovered": 2, 00:29:17.560 "num_base_bdevs_operational": 2, 00:29:17.560 "process": { 00:29:17.560 "type": "rebuild", 00:29:17.560 "target": "spare", 00:29:17.560 "progress": { 00:29:17.560 "blocks": 3072, 00:29:17.560 "percent": 38 00:29:17.560 } 00:29:17.560 }, 00:29:17.560 "base_bdevs_list": [ 00:29:17.560 { 00:29:17.560 "name": "spare", 00:29:17.560 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:17.560 "is_configured": true, 00:29:17.560 "data_offset": 256, 00:29:17.560 "data_size": 7936 00:29:17.560 }, 00:29:17.560 { 00:29:17.560 "name": "BaseBdev2", 00:29:17.560 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:17.560 "is_configured": true, 00:29:17.560 "data_offset": 256, 00:29:17.560 "data_size": 7936 00:29:17.560 } 00:29:17.560 ] 00:29:17.560 }' 00:29:17.560 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:17.560 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:29:17.560 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:17.560 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:17.560 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:17.819 [2024-07-24 20:05:09.346366] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:17.819 [2024-07-24 20:05:09.381894] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:17.819 [2024-07-24 20:05:09.381947] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:17.819 [2024-07-24 20:05:09.381964] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:17.819 [2024-07-24 20:05:09.381972] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.820 20:05:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.820 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:18.079 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.079 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:18.079 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:18.079 "name": "raid_bdev1", 00:29:18.079 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:18.079 "strip_size_kb": 0, 00:29:18.079 "state": "online", 00:29:18.079 "raid_level": "raid1", 00:29:18.079 "superblock": true, 00:29:18.079 "num_base_bdevs": 2, 00:29:18.079 "num_base_bdevs_discovered": 1, 00:29:18.079 "num_base_bdevs_operational": 1, 00:29:18.079 "base_bdevs_list": [ 00:29:18.079 { 00:29:18.079 "name": null, 00:29:18.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:18.079 "is_configured": false, 00:29:18.079 "data_offset": 256, 00:29:18.079 "data_size": 7936 00:29:18.079 }, 00:29:18.079 { 00:29:18.079 "name": "BaseBdev2", 00:29:18.079 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:18.079 "is_configured": true, 00:29:18.079 "data_offset": 256, 00:29:18.079 "data_size": 7936 00:29:18.079 } 00:29:18.079 ] 00:29:18.079 }' 00:29:18.079 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:18.079 20:05:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:19.016 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process 
raid_bdev1 none none 00:29:19.016 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:19.016 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:19.016 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:19.016 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:19.016 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.016 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.275 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:19.275 "name": "raid_bdev1", 00:29:19.275 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:19.275 "strip_size_kb": 0, 00:29:19.275 "state": "online", 00:29:19.275 "raid_level": "raid1", 00:29:19.275 "superblock": true, 00:29:19.275 "num_base_bdevs": 2, 00:29:19.275 "num_base_bdevs_discovered": 1, 00:29:19.275 "num_base_bdevs_operational": 1, 00:29:19.275 "base_bdevs_list": [ 00:29:19.275 { 00:29:19.275 "name": null, 00:29:19.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:19.275 "is_configured": false, 00:29:19.275 "data_offset": 256, 00:29:19.275 "data_size": 7936 00:29:19.275 }, 00:29:19.275 { 00:29:19.275 "name": "BaseBdev2", 00:29:19.275 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:19.275 "is_configured": true, 00:29:19.275 "data_offset": 256, 00:29:19.275 "data_size": 7936 00:29:19.275 } 00:29:19.275 ] 00:29:19.275 }' 00:29:19.275 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:19.275 20:05:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:19.275 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:19.275 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:19.275 20:05:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:19.535 [2024-07-24 20:05:11.074023] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:19.535 [2024-07-24 20:05:11.076506] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21117b0 00:29:19.535 [2024-07-24 20:05:11.077978] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:19.535 20:05:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@678 -- # sleep 1 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:20.913 "name": "raid_bdev1", 00:29:20.913 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:20.913 "strip_size_kb": 0, 00:29:20.913 "state": "online", 00:29:20.913 "raid_level": "raid1", 00:29:20.913 "superblock": true, 00:29:20.913 "num_base_bdevs": 2, 00:29:20.913 "num_base_bdevs_discovered": 2, 00:29:20.913 "num_base_bdevs_operational": 2, 00:29:20.913 "process": { 00:29:20.913 "type": "rebuild", 00:29:20.913 "target": "spare", 00:29:20.913 "progress": { 00:29:20.913 "blocks": 3072, 00:29:20.913 "percent": 38 00:29:20.913 } 00:29:20.913 }, 00:29:20.913 "base_bdevs_list": [ 00:29:20.913 { 00:29:20.913 "name": "spare", 00:29:20.913 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:20.913 "is_configured": true, 00:29:20.913 "data_offset": 256, 00:29:20.913 "data_size": 7936 00:29:20.913 }, 00:29:20.913 { 00:29:20.913 "name": "BaseBdev2", 00:29:20.913 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:20.913 "is_configured": true, 00:29:20.913 "data_offset": 256, 00:29:20.913 "data_size": 7936 00:29:20.913 } 00:29:20.913 ] 00:29:20.913 }' 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:29:20.913 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:29:20.913 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # local timeout=1133 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.914 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.172 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:21.172 "name": "raid_bdev1", 00:29:21.172 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:21.172 "strip_size_kb": 0, 00:29:21.172 "state": "online", 00:29:21.172 "raid_level": "raid1", 00:29:21.172 
"superblock": true, 00:29:21.172 "num_base_bdevs": 2, 00:29:21.172 "num_base_bdevs_discovered": 2, 00:29:21.172 "num_base_bdevs_operational": 2, 00:29:21.172 "process": { 00:29:21.172 "type": "rebuild", 00:29:21.172 "target": "spare", 00:29:21.172 "progress": { 00:29:21.172 "blocks": 3840, 00:29:21.172 "percent": 48 00:29:21.172 } 00:29:21.172 }, 00:29:21.172 "base_bdevs_list": [ 00:29:21.172 { 00:29:21.172 "name": "spare", 00:29:21.172 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:21.172 "is_configured": true, 00:29:21.172 "data_offset": 256, 00:29:21.172 "data_size": 7936 00:29:21.172 }, 00:29:21.172 { 00:29:21.172 "name": "BaseBdev2", 00:29:21.172 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:21.172 "is_configured": true, 00:29:21.172 "data_offset": 256, 00:29:21.172 "data_size": 7936 00:29:21.172 } 00:29:21.172 ] 00:29:21.172 }' 00:29:21.172 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:21.172 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:21.172 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:21.431 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:21.431 20:05:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:22.367 20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:22.367 20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:22.367 20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:22.367 20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:22.367 
20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:22.367 20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:22.367 20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.367 20:05:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.626 20:05:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:22.626 "name": "raid_bdev1", 00:29:22.626 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:22.626 "strip_size_kb": 0, 00:29:22.626 "state": "online", 00:29:22.626 "raid_level": "raid1", 00:29:22.626 "superblock": true, 00:29:22.626 "num_base_bdevs": 2, 00:29:22.626 "num_base_bdevs_discovered": 2, 00:29:22.626 "num_base_bdevs_operational": 2, 00:29:22.626 "process": { 00:29:22.626 "type": "rebuild", 00:29:22.626 "target": "spare", 00:29:22.626 "progress": { 00:29:22.626 "blocks": 7424, 00:29:22.626 "percent": 93 00:29:22.626 } 00:29:22.626 }, 00:29:22.626 "base_bdevs_list": [ 00:29:22.626 { 00:29:22.626 "name": "spare", 00:29:22.626 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:22.626 "is_configured": true, 00:29:22.626 "data_offset": 256, 00:29:22.626 "data_size": 7936 00:29:22.626 }, 00:29:22.626 { 00:29:22.626 "name": "BaseBdev2", 00:29:22.626 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:22.626 "is_configured": true, 00:29:22.626 "data_offset": 256, 00:29:22.626 "data_size": 7936 00:29:22.626 } 00:29:22.626 ] 00:29:22.626 }' 00:29:22.626 20:05:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:22.626 20:05:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d 
]] 00:29:22.626 20:05:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:22.626 20:05:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:22.626 20:05:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:29:22.626 [2024-07-24 20:05:14.202601] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:22.626 [2024-07-24 20:05:14.202658] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:22.626 [2024-07-24 20:05:14.202740] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.563 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.822 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:23.822 "name": "raid_bdev1", 00:29:23.822 "uuid": 
"267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:23.822 "strip_size_kb": 0, 00:29:23.822 "state": "online", 00:29:23.822 "raid_level": "raid1", 00:29:23.822 "superblock": true, 00:29:23.822 "num_base_bdevs": 2, 00:29:23.822 "num_base_bdevs_discovered": 2, 00:29:23.822 "num_base_bdevs_operational": 2, 00:29:23.822 "base_bdevs_list": [ 00:29:23.822 { 00:29:23.822 "name": "spare", 00:29:23.822 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:23.822 "is_configured": true, 00:29:23.822 "data_offset": 256, 00:29:23.822 "data_size": 7936 00:29:23.822 }, 00:29:23.822 { 00:29:23.822 "name": "BaseBdev2", 00:29:23.822 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:23.822 "is_configured": true, 00:29:23.822 "data_offset": 256, 00:29:23.822 "data_size": 7936 00:29:23.822 } 00:29:23.822 ] 00:29:23.822 }' 00:29:23.822 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@724 -- # break 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.082 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:24.341 "name": "raid_bdev1", 00:29:24.341 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:24.341 "strip_size_kb": 0, 00:29:24.341 "state": "online", 00:29:24.341 "raid_level": "raid1", 00:29:24.341 "superblock": true, 00:29:24.341 "num_base_bdevs": 2, 00:29:24.341 "num_base_bdevs_discovered": 2, 00:29:24.341 "num_base_bdevs_operational": 2, 00:29:24.341 "base_bdevs_list": [ 00:29:24.341 { 00:29:24.341 "name": "spare", 00:29:24.341 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:24.341 "is_configured": true, 00:29:24.341 "data_offset": 256, 00:29:24.341 "data_size": 7936 00:29:24.341 }, 00:29:24.341 { 00:29:24.341 "name": "BaseBdev2", 00:29:24.341 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:24.341 "is_configured": true, 00:29:24.341 "data_offset": 256, 00:29:24.341 "data_size": 7936 00:29:24.341 } 00:29:24.341 ] 00:29:24.341 }' 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 2 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.341 20:05:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:24.600 20:05:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:24.600 "name": "raid_bdev1", 00:29:24.600 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:24.600 "strip_size_kb": 0, 00:29:24.600 "state": "online", 00:29:24.600 "raid_level": "raid1", 00:29:24.600 "superblock": true, 00:29:24.600 "num_base_bdevs": 2, 00:29:24.600 "num_base_bdevs_discovered": 2, 00:29:24.600 "num_base_bdevs_operational": 2, 00:29:24.600 "base_bdevs_list": [ 00:29:24.600 { 00:29:24.600 "name": "spare", 
00:29:24.600 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:24.600 "is_configured": true, 00:29:24.600 "data_offset": 256, 00:29:24.600 "data_size": 7936 00:29:24.600 }, 00:29:24.600 { 00:29:24.600 "name": "BaseBdev2", 00:29:24.600 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:24.600 "is_configured": true, 00:29:24.600 "data_offset": 256, 00:29:24.600 "data_size": 7936 00:29:24.600 } 00:29:24.600 ] 00:29:24.600 }' 00:29:24.600 20:05:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:24.600 20:05:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:25.168 20:05:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:25.427 [2024-07-24 20:05:16.921621] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:25.428 [2024-07-24 20:05:16.921655] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:25.428 [2024-07-24 20:05:16.921716] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:25.428 [2024-07-24 20:05:16.921775] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:25.428 [2024-07-24 20:05:16.921788] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x210e2d0 name raid_bdev1, state offline 00:29:25.428 20:05:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # jq length 00:29:25.428 20:05:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:29:25.687 20:05:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:25.687 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:29:25.946 /dev/nbd0 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@869 -- # local i 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:25.946 1+0 records in 00:29:25.946 1+0 records out 00:29:25.946 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223734 s, 18.3 MB/s 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 
-- # (( i < 2 )) 00:29:25.946 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:29:26.205 /dev/nbd1 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:26.205 1+0 records in 00:29:26.205 1+0 records out 00:29:26.205 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322487 s, 12.7 MB/s 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:26.205 20:05:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:26.205 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:29:26.464 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:29:26.464 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:26.464 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:26.464 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:26.464 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:29:26.464 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:26.464 20:05:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:26.723 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:26.982 20:05:18 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:29:26.982 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:27.241 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:27.565 [2024-07-24 20:05:18.898204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:27.565 [2024-07-24 20:05:18.898249] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:27.565 [2024-07-24 20:05:18.898270] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f75840 00:29:27.565 [2024-07-24 20:05:18.898283] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:27.565 [2024-07-24 20:05:18.899752] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:27.565 [2024-07-24 20:05:18.899779] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:27.565 [2024-07-24 20:05:18.899839] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:27.565 [2024-07-24 20:05:18.899867] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:27.565 [2024-07-24 20:05:18.899963] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:27.565 spare 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.565 20:05:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:27.565 [2024-07-24 20:05:19.000272] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x210ed90 00:29:27.565 [2024-07-24 20:05:19.000288] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:27.565 [2024-07-24 20:05:19.000358] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x207bab0 00:29:27.565 [2024-07-24 20:05:19.000486] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x210ed90 00:29:27.565 [2024-07-24 20:05:19.000496] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x210ed90 00:29:27.565 [2024-07-24 20:05:19.000586] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:27.825 
20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:27.825 "name": "raid_bdev1", 00:29:27.825 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:27.825 "strip_size_kb": 0, 00:29:27.825 "state": "online", 00:29:27.825 "raid_level": "raid1", 00:29:27.825 "superblock": true, 00:29:27.825 "num_base_bdevs": 2, 00:29:27.825 "num_base_bdevs_discovered": 2, 00:29:27.825 "num_base_bdevs_operational": 2, 00:29:27.825 "base_bdevs_list": [ 00:29:27.825 { 00:29:27.825 "name": "spare", 00:29:27.825 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:27.825 "is_configured": true, 00:29:27.825 "data_offset": 256, 00:29:27.825 "data_size": 7936 00:29:27.825 }, 00:29:27.825 { 00:29:27.825 "name": "BaseBdev2", 00:29:27.825 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:27.825 "is_configured": true, 00:29:27.825 "data_offset": 256, 00:29:27.825 "data_size": 7936 00:29:27.825 } 00:29:27.825 ] 00:29:27.825 }' 00:29:27.825 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:27.825 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:28.393 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:28.393 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:28.393 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:28.393 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:28.393 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:28.393 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:29:28.393 20:05:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.652 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:28.652 "name": "raid_bdev1", 00:29:28.652 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:28.652 "strip_size_kb": 0, 00:29:28.652 "state": "online", 00:29:28.652 "raid_level": "raid1", 00:29:28.652 "superblock": true, 00:29:28.652 "num_base_bdevs": 2, 00:29:28.652 "num_base_bdevs_discovered": 2, 00:29:28.652 "num_base_bdevs_operational": 2, 00:29:28.652 "base_bdevs_list": [ 00:29:28.652 { 00:29:28.652 "name": "spare", 00:29:28.652 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:28.652 "is_configured": true, 00:29:28.652 "data_offset": 256, 00:29:28.652 "data_size": 7936 00:29:28.652 }, 00:29:28.652 { 00:29:28.652 "name": "BaseBdev2", 00:29:28.652 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:28.652 "is_configured": true, 00:29:28.652 "data_offset": 256, 00:29:28.653 "data_size": 7936 00:29:28.653 } 00:29:28.653 ] 00:29:28.653 }' 00:29:28.653 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:28.653 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:28.653 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:28.653 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:28.653 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.653 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:28.911 20:05:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:29:28.911 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:29.171 [2024-07-24 20:05:20.538839] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.171 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:29:29.432 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:29.432 "name": "raid_bdev1", 00:29:29.432 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:29.432 "strip_size_kb": 0, 00:29:29.432 "state": "online", 00:29:29.432 "raid_level": "raid1", 00:29:29.432 "superblock": true, 00:29:29.432 "num_base_bdevs": 2, 00:29:29.432 "num_base_bdevs_discovered": 1, 00:29:29.432 "num_base_bdevs_operational": 1, 00:29:29.432 "base_bdevs_list": [ 00:29:29.432 { 00:29:29.432 "name": null, 00:29:29.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:29.432 "is_configured": false, 00:29:29.432 "data_offset": 256, 00:29:29.432 "data_size": 7936 00:29:29.432 }, 00:29:29.432 { 00:29:29.432 "name": "BaseBdev2", 00:29:29.432 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:29.432 "is_configured": true, 00:29:29.432 "data_offset": 256, 00:29:29.432 "data_size": 7936 00:29:29.432 } 00:29:29.432 ] 00:29:29.432 }' 00:29:29.432 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:29.432 20:05:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:30.001 20:05:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:30.260 [2024-07-24 20:05:21.621741] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:30.260 [2024-07-24 20:05:21.621887] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:30.260 [2024-07-24 20:05:21.621904] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:30.260 [2024-07-24 20:05:21.621931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:30.260 [2024-07-24 20:05:21.624132] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21071e0 00:29:30.260 [2024-07-24 20:05:21.625560] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:30.260 20:05:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # sleep 1 00:29:31.196 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:31.196 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:31.196 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:31.196 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:31.196 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:31.196 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.196 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:31.455 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:31.455 "name": "raid_bdev1", 00:29:31.455 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:31.455 "strip_size_kb": 0, 00:29:31.455 "state": "online", 00:29:31.455 "raid_level": "raid1", 00:29:31.455 "superblock": true, 00:29:31.455 "num_base_bdevs": 2, 00:29:31.455 "num_base_bdevs_discovered": 2, 00:29:31.455 "num_base_bdevs_operational": 2, 00:29:31.455 "process": { 00:29:31.455 "type": "rebuild", 00:29:31.455 
"target": "spare", 00:29:31.455 "progress": { 00:29:31.455 "blocks": 3072, 00:29:31.455 "percent": 38 00:29:31.455 } 00:29:31.455 }, 00:29:31.455 "base_bdevs_list": [ 00:29:31.455 { 00:29:31.455 "name": "spare", 00:29:31.455 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:31.455 "is_configured": true, 00:29:31.455 "data_offset": 256, 00:29:31.455 "data_size": 7936 00:29:31.455 }, 00:29:31.455 { 00:29:31.455 "name": "BaseBdev2", 00:29:31.455 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:31.455 "is_configured": true, 00:29:31.455 "data_offset": 256, 00:29:31.455 "data_size": 7936 00:29:31.455 } 00:29:31.455 ] 00:29:31.455 }' 00:29:31.455 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:31.455 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:31.455 20:05:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:31.455 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:31.455 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:31.714 [2024-07-24 20:05:23.227056] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:31.714 [2024-07-24 20:05:23.237894] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:31.714 [2024-07-24 20:05:23.237937] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:31.714 [2024-07-24 20:05:23.237953] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:31.714 [2024-07-24 20:05:23.237961] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.714 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:31.986 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:31.986 "name": "raid_bdev1", 00:29:31.986 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:31.986 "strip_size_kb": 0, 00:29:31.986 "state": "online", 00:29:31.986 "raid_level": "raid1", 00:29:31.986 "superblock": true, 00:29:31.986 "num_base_bdevs": 2, 00:29:31.986 "num_base_bdevs_discovered": 1, 
00:29:31.986 "num_base_bdevs_operational": 1, 00:29:31.986 "base_bdevs_list": [ 00:29:31.986 { 00:29:31.986 "name": null, 00:29:31.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:31.986 "is_configured": false, 00:29:31.986 "data_offset": 256, 00:29:31.986 "data_size": 7936 00:29:31.986 }, 00:29:31.986 { 00:29:31.986 "name": "BaseBdev2", 00:29:31.986 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:31.986 "is_configured": true, 00:29:31.986 "data_offset": 256, 00:29:31.986 "data_size": 7936 00:29:31.986 } 00:29:31.986 ] 00:29:31.986 }' 00:29:31.986 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:31.986 20:05:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:32.556 20:05:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:32.814 [2024-07-24 20:05:24.336439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:32.814 [2024-07-24 20:05:24.336489] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:32.814 [2024-07-24 20:05:24.336512] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210f010 00:29:32.814 [2024-07-24 20:05:24.336525] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:32.814 [2024-07-24 20:05:24.336741] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:32.814 [2024-07-24 20:05:24.336758] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:32.814 [2024-07-24 20:05:24.336817] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:32.814 [2024-07-24 20:05:24.336828] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:29:32.814 [2024-07-24 20:05:24.336839] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:32.814 [2024-07-24 20:05:24.336857] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:32.814 [2024-07-24 20:05:24.339051] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2111660 00:29:32.814 [2024-07-24 20:05:24.340436] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:32.814 spare 00:29:32.814 20:05:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # sleep 1 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:34.191 "name": "raid_bdev1", 00:29:34.191 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:34.191 "strip_size_kb": 0, 00:29:34.191 "state": "online", 00:29:34.191 "raid_level": "raid1", 00:29:34.191 "superblock": 
true, 00:29:34.191 "num_base_bdevs": 2, 00:29:34.191 "num_base_bdevs_discovered": 2, 00:29:34.191 "num_base_bdevs_operational": 2, 00:29:34.191 "process": { 00:29:34.191 "type": "rebuild", 00:29:34.191 "target": "spare", 00:29:34.191 "progress": { 00:29:34.191 "blocks": 3072, 00:29:34.191 "percent": 38 00:29:34.191 } 00:29:34.191 }, 00:29:34.191 "base_bdevs_list": [ 00:29:34.191 { 00:29:34.191 "name": "spare", 00:29:34.191 "uuid": "bf8aa0fa-e165-5c7d-a403-1c0c12d2c125", 00:29:34.191 "is_configured": true, 00:29:34.191 "data_offset": 256, 00:29:34.191 "data_size": 7936 00:29:34.191 }, 00:29:34.191 { 00:29:34.191 "name": "BaseBdev2", 00:29:34.191 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:34.191 "is_configured": true, 00:29:34.191 "data_offset": 256, 00:29:34.191 "data_size": 7936 00:29:34.191 } 00:29:34.191 ] 00:29:34.191 }' 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:34.191 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:34.450 [2024-07-24 20:05:25.917499] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:34.450 [2024-07-24 20:05:25.953101] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:34.450 [2024-07-24 20:05:25.953147] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:34.450 [2024-07-24 20:05:25.953163] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:34.450 [2024-07-24 20:05:25.953171] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.450 20:05:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:34.709 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:34.709 "name": "raid_bdev1", 00:29:34.709 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 
00:29:34.709 "strip_size_kb": 0, 00:29:34.709 "state": "online", 00:29:34.709 "raid_level": "raid1", 00:29:34.709 "superblock": true, 00:29:34.709 "num_base_bdevs": 2, 00:29:34.709 "num_base_bdevs_discovered": 1, 00:29:34.709 "num_base_bdevs_operational": 1, 00:29:34.709 "base_bdevs_list": [ 00:29:34.709 { 00:29:34.709 "name": null, 00:29:34.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:34.709 "is_configured": false, 00:29:34.709 "data_offset": 256, 00:29:34.709 "data_size": 7936 00:29:34.709 }, 00:29:34.709 { 00:29:34.709 "name": "BaseBdev2", 00:29:34.709 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:34.709 "is_configured": true, 00:29:34.709 "data_offset": 256, 00:29:34.709 "data_size": 7936 00:29:34.709 } 00:29:34.709 ] 00:29:34.709 }' 00:29:34.709 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:34.709 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:35.275 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:35.275 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:35.275 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:35.276 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:35.276 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:35.276 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:35.276 20:05:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.533 20:05:27 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:35.533 "name": "raid_bdev1", 00:29:35.533 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:35.533 "strip_size_kb": 0, 00:29:35.533 "state": "online", 00:29:35.533 "raid_level": "raid1", 00:29:35.533 "superblock": true, 00:29:35.533 "num_base_bdevs": 2, 00:29:35.533 "num_base_bdevs_discovered": 1, 00:29:35.533 "num_base_bdevs_operational": 1, 00:29:35.533 "base_bdevs_list": [ 00:29:35.533 { 00:29:35.533 "name": null, 00:29:35.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.533 "is_configured": false, 00:29:35.533 "data_offset": 256, 00:29:35.533 "data_size": 7936 00:29:35.533 }, 00:29:35.533 { 00:29:35.533 "name": "BaseBdev2", 00:29:35.533 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:35.533 "is_configured": true, 00:29:35.533 "data_offset": 256, 00:29:35.533 "data_size": 7936 00:29:35.533 } 00:29:35.533 ] 00:29:35.533 }' 00:29:35.533 20:05:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:35.792 20:05:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:35.792 20:05:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:35.792 20:05:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:35.792 20:05:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:36.050 20:05:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:36.308 [2024-07-24 20:05:27.653229] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:29:36.308 [2024-07-24 20:05:27.653277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:36.308 [2024-07-24 20:05:27.653297] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2113290 00:29:36.308 [2024-07-24 20:05:27.653309] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:36.308 [2024-07-24 20:05:27.653514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:36.308 [2024-07-24 20:05:27.653531] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:36.308 [2024-07-24 20:05:27.653576] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:36.308 [2024-07-24 20:05:27.653588] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:36.308 [2024-07-24 20:05:27.653598] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:36.308 BaseBdev1 00:29:36.308 20:05:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@789 -- # sleep 1 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:37.244 
20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.244 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:37.502 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:37.502 "name": "raid_bdev1", 00:29:37.502 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:37.502 "strip_size_kb": 0, 00:29:37.502 "state": "online", 00:29:37.502 "raid_level": "raid1", 00:29:37.502 "superblock": true, 00:29:37.502 "num_base_bdevs": 2, 00:29:37.502 "num_base_bdevs_discovered": 1, 00:29:37.502 "num_base_bdevs_operational": 1, 00:29:37.502 "base_bdevs_list": [ 00:29:37.502 { 00:29:37.502 "name": null, 00:29:37.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:37.502 "is_configured": false, 00:29:37.502 "data_offset": 256, 00:29:37.502 "data_size": 7936 00:29:37.502 }, 00:29:37.502 { 00:29:37.502 "name": "BaseBdev2", 00:29:37.502 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:37.502 "is_configured": true, 00:29:37.502 "data_offset": 256, 00:29:37.502 "data_size": 7936 00:29:37.502 } 00:29:37.502 ] 00:29:37.502 }' 00:29:37.502 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:37.502 20:05:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:29:38.070 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:38.070 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:38.070 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:38.070 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:38.070 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:38.070 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.070 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:38.329 "name": "raid_bdev1", 00:29:38.329 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:38.329 "strip_size_kb": 0, 00:29:38.329 "state": "online", 00:29:38.329 "raid_level": "raid1", 00:29:38.329 "superblock": true, 00:29:38.329 "num_base_bdevs": 2, 00:29:38.329 "num_base_bdevs_discovered": 1, 00:29:38.329 "num_base_bdevs_operational": 1, 00:29:38.329 "base_bdevs_list": [ 00:29:38.329 { 00:29:38.329 "name": null, 00:29:38.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:38.329 "is_configured": false, 00:29:38.329 "data_offset": 256, 00:29:38.329 "data_size": 7936 00:29:38.329 }, 00:29:38.329 { 00:29:38.329 "name": "BaseBdev2", 00:29:38.329 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:38.329 "is_configured": true, 00:29:38.329 "data_offset": 256, 00:29:38.329 "data_size": 7936 00:29:38.329 } 00:29:38.329 ] 00:29:38.329 }' 00:29:38.329 20:05:29 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 
00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:38.329 20:05:29 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:38.588 [2024-07-24 20:05:30.111777] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:38.588 [2024-07-24 20:05:30.111905] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:38.588 [2024-07-24 20:05:30.111921] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:38.588 request: 00:29:38.588 { 00:29:38.588 "base_bdev": "BaseBdev1", 00:29:38.588 "raid_bdev": "raid_bdev1", 00:29:38.588 "method": "bdev_raid_add_base_bdev", 00:29:38.588 "req_id": 1 00:29:38.588 } 00:29:38.588 Got JSON-RPC error response 00:29:38.588 response: 00:29:38.588 { 00:29:38.588 "code": -22, 00:29:38.588 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:38.588 } 00:29:38.588 20:05:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # es=1 00:29:38.588 20:05:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:29:38.588 20:05:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:29:38.588 20:05:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:29:38.588 20:05:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@793 -- # 
sleep 1 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:39.967 "name": "raid_bdev1", 00:29:39.967 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:39.967 "strip_size_kb": 0, 00:29:39.967 "state": "online", 00:29:39.967 "raid_level": "raid1", 00:29:39.967 "superblock": true, 00:29:39.967 "num_base_bdevs": 2, 00:29:39.967 "num_base_bdevs_discovered": 1, 
00:29:39.967 "num_base_bdevs_operational": 1, 00:29:39.967 "base_bdevs_list": [ 00:29:39.967 { 00:29:39.967 "name": null, 00:29:39.967 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:39.967 "is_configured": false, 00:29:39.967 "data_offset": 256, 00:29:39.967 "data_size": 7936 00:29:39.967 }, 00:29:39.967 { 00:29:39.967 "name": "BaseBdev2", 00:29:39.967 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:39.967 "is_configured": true, 00:29:39.967 "data_offset": 256, 00:29:39.967 "data_size": 7936 00:29:39.967 } 00:29:39.967 ] 00:29:39.967 }' 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:39.967 20:05:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:40.535 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:40.535 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:40.535 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:40.535 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:40.535 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:40.535 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:40.535 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:40.795 "name": "raid_bdev1", 00:29:40.795 "uuid": "267cee7b-ac52-43bb-a9e5-422043a56d72", 00:29:40.795 "strip_size_kb": 0, 00:29:40.795 
"state": "online", 00:29:40.795 "raid_level": "raid1", 00:29:40.795 "superblock": true, 00:29:40.795 "num_base_bdevs": 2, 00:29:40.795 "num_base_bdevs_discovered": 1, 00:29:40.795 "num_base_bdevs_operational": 1, 00:29:40.795 "base_bdevs_list": [ 00:29:40.795 { 00:29:40.795 "name": null, 00:29:40.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:40.795 "is_configured": false, 00:29:40.795 "data_offset": 256, 00:29:40.795 "data_size": 7936 00:29:40.795 }, 00:29:40.795 { 00:29:40.795 "name": "BaseBdev2", 00:29:40.795 "uuid": "99e8ba4b-bad0-523b-8f43-0aec5f9b6593", 00:29:40.795 "is_configured": true, 00:29:40.795 "data_offset": 256, 00:29:40.795 "data_size": 7936 00:29:40.795 } 00:29:40.795 ] 00:29:40.795 }' 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@798 -- # killprocess 1530742 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1530742 ']' 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1530742 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:40.795 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1530742 00:29:41.054 20:05:32 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:41.054 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:41.054 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1530742' 00:29:41.054 killing process with pid 1530742 00:29:41.054 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1530742 00:29:41.054 Received shutdown signal, test time was about 60.000000 seconds 00:29:41.054 00:29:41.054 Latency(us) 00:29:41.054 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:41.054 =================================================================================================================== 00:29:41.054 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:41.054 [2024-07-24 20:05:32.432584] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:41.054 [2024-07-24 20:05:32.432675] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:41.054 [2024-07-24 20:05:32.432721] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:41.054 [2024-07-24 20:05:32.432734] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x210ed90 name raid_bdev1, state offline 00:29:41.054 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 1530742 00:29:41.054 [2024-07-24 20:05:32.471474] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:41.314 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@800 -- # return 0 00:29:41.314 00:29:41.314 real 0m31.520s 00:29:41.314 user 0m49.478s 00:29:41.314 sys 0m5.226s 00:29:41.314 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:29:41.314 20:05:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:41.314 ************************************ 00:29:41.314 END TEST raid_rebuild_test_sb_md_separate 00:29:41.314 ************************************ 00:29:41.314 20:05:32 bdev_raid -- bdev/bdev_raid.sh@991 -- # base_malloc_params='-m 32 -i' 00:29:41.314 20:05:32 bdev_raid -- bdev/bdev_raid.sh@992 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:29:41.314 20:05:32 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:29:41.314 20:05:32 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:41.314 20:05:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:41.314 ************************************ 00:29:41.314 START TEST raid_state_function_test_sb_md_interleaved 00:29:41.314 ************************************ 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:41.314 20:05:32 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # 
raid_pid=1535245 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1535245' 00:29:41.314 Process raid pid: 1535245 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1535245 /var/tmp/spdk-raid.sock 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1535245 ']' 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:41.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:41.314 20:05:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:41.314 [2024-07-24 20:05:32.850578] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:29:41.314 [2024-07-24 20:05:32.850646] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:41.573 [2024-07-24 20:05:32.975014] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:41.573 [2024-07-24 20:05:33.082303] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:41.573 [2024-07-24 20:05:33.144630] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:41.573 [2024-07-24 20:05:33.144685] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:42.510 20:05:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:42.510 20:05:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:29:42.510 20:05:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:42.510 [2024-07-24 20:05:34.049207] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:42.510 [2024-07-24 20:05:34.049253] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:42.510 [2024-07-24 20:05:34.049264] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:42.510 [2024-07-24 20:05:34.049276] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:42.510 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:42.510 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:42.510 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:42.510 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:42.510 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:42.510 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:42.510 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.511 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.511 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.511 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.511 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.511 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:42.770 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.770 "name": "Existed_Raid", 00:29:42.770 "uuid": "06fdd8d6-3f84-41c2-a8e2-0ed685032bb2", 00:29:42.770 "strip_size_kb": 0, 00:29:42.770 "state": "configuring", 00:29:42.770 "raid_level": "raid1", 00:29:42.770 "superblock": true, 00:29:42.770 "num_base_bdevs": 2, 00:29:42.770 "num_base_bdevs_discovered": 0, 00:29:42.770 "num_base_bdevs_operational": 2, 00:29:42.770 "base_bdevs_list": [ 00:29:42.770 { 
00:29:42.770 "name": "BaseBdev1", 00:29:42.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.770 "is_configured": false, 00:29:42.770 "data_offset": 0, 00:29:42.770 "data_size": 0 00:29:42.770 }, 00:29:42.770 { 00:29:42.770 "name": "BaseBdev2", 00:29:42.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.770 "is_configured": false, 00:29:42.770 "data_offset": 0, 00:29:42.770 "data_size": 0 00:29:42.770 } 00:29:42.770 ] 00:29:42.770 }' 00:29:42.770 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.770 20:05:34 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:43.705 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:43.705 [2024-07-24 20:05:35.244243] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:43.705 [2024-07-24 20:05:35.244271] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x171d9f0 name Existed_Raid, state configuring 00:29:43.705 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:43.963 [2024-07-24 20:05:35.484900] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:43.963 [2024-07-24 20:05:35.484931] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:43.963 [2024-07-24 20:05:35.484941] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:43.963 [2024-07-24 20:05:35.484952] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:43.963 
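The trace above repeatedly runs `bdev_raid_get_bdevs all` and filters the result with `jq -r '.[] | select(.name == "Existed_Raid")'`, then compares fields such as `state`, `raid_level`, `strip_size_kb` and `num_base_bdevs_operational` against the locals set in `verify_raid_bdev_state`. A minimal Python sketch of that selection-and-check logic (the helper names and the abridged sample JSON are illustrative, not part of the test script):

```python
import json

# Abridged sample of bdev_raid_get_bdevs output, modeled on the JSON in the log above.
RAW = json.dumps([{
    "name": "Existed_Raid",
    "strip_size_kb": 0,
    "state": "configuring",
    "raid_level": "raid1",
    "superblock": True,
    "num_base_bdevs": 2,
    "num_base_bdevs_discovered": 0,
    "num_base_bdevs_operational": 2,
}])

def select_raid_bdev(raw, name):
    # Python equivalent of: jq -r '.[] | select(.name == "<name>")'
    return next((b for b in json.loads(raw) if b["name"] == name), None)

def verify_raid_bdev_state(info, expected_state, raid_level, strip_size, operational):
    # Mirror the field comparisons the shell function performs on the selected JSON.
    return (info is not None
            and info["state"] == expected_state
            and info["raid_level"] == raid_level
            and info["strip_size_kb"] == strip_size
            and info["num_base_bdevs_operational"] == operational)

info = select_raid_bdev(RAW, "Existed_Raid")
print(verify_raid_bdev_state(info, "configuring", "raid1", 0, 2))  # True
```

In the log this check passes because no base bdevs exist yet, so the array is created from superblock metadata in the `configuring` state with zero bdevs discovered.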
20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:29:44.222 [2024-07-24 20:05:35.727958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:44.222 BaseBdev1 00:29:44.222 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:44.222 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:29:44.222 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:44.222 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:29:44.222 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:44.222 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:44.222 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:44.481 20:05:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:44.740 [ 00:29:44.740 { 00:29:44.740 "name": "BaseBdev1", 00:29:44.740 "aliases": [ 00:29:44.740 "a499368a-7af5-4d2f-becb-cb2a303f7c5a" 00:29:44.740 ], 00:29:44.740 "product_name": "Malloc disk", 00:29:44.740 "block_size": 4128, 00:29:44.740 "num_blocks": 8192, 00:29:44.740 "uuid": "a499368a-7af5-4d2f-becb-cb2a303f7c5a", 00:29:44.740 "md_size": 32, 00:29:44.740 
"md_interleave": true, 00:29:44.740 "dif_type": 0, 00:29:44.740 "assigned_rate_limits": { 00:29:44.740 "rw_ios_per_sec": 0, 00:29:44.740 "rw_mbytes_per_sec": 0, 00:29:44.740 "r_mbytes_per_sec": 0, 00:29:44.740 "w_mbytes_per_sec": 0 00:29:44.740 }, 00:29:44.740 "claimed": true, 00:29:44.740 "claim_type": "exclusive_write", 00:29:44.740 "zoned": false, 00:29:44.740 "supported_io_types": { 00:29:44.740 "read": true, 00:29:44.740 "write": true, 00:29:44.740 "unmap": true, 00:29:44.740 "flush": true, 00:29:44.740 "reset": true, 00:29:44.740 "nvme_admin": false, 00:29:44.740 "nvme_io": false, 00:29:44.740 "nvme_io_md": false, 00:29:44.740 "write_zeroes": true, 00:29:44.740 "zcopy": true, 00:29:44.740 "get_zone_info": false, 00:29:44.740 "zone_management": false, 00:29:44.740 "zone_append": false, 00:29:44.740 "compare": false, 00:29:44.740 "compare_and_write": false, 00:29:44.740 "abort": true, 00:29:44.740 "seek_hole": false, 00:29:44.740 "seek_data": false, 00:29:44.740 "copy": true, 00:29:44.740 "nvme_iov_md": false 00:29:44.740 }, 00:29:44.740 "memory_domains": [ 00:29:44.740 { 00:29:44.740 "dma_device_id": "system", 00:29:44.740 "dma_device_type": 1 00:29:44.740 }, 00:29:44.740 { 00:29:44.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.740 "dma_device_type": 2 00:29:44.740 } 00:29:44.740 ], 00:29:44.740 "driver_specific": {} 00:29:44.740 } 00:29:44.740 ] 00:29:44.740 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:44.741 20:05:36 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.741 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:45.000 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.000 "name": "Existed_Raid", 00:29:45.000 "uuid": "10862922-dfea-4183-9e42-2f240a6864fb", 00:29:45.000 "strip_size_kb": 0, 00:29:45.000 "state": "configuring", 00:29:45.000 "raid_level": "raid1", 00:29:45.000 "superblock": true, 00:29:45.000 "num_base_bdevs": 2, 00:29:45.000 "num_base_bdevs_discovered": 1, 00:29:45.000 "num_base_bdevs_operational": 2, 00:29:45.000 "base_bdevs_list": [ 00:29:45.000 { 00:29:45.000 "name": "BaseBdev1", 00:29:45.000 "uuid": "a499368a-7af5-4d2f-becb-cb2a303f7c5a", 00:29:45.000 "is_configured": true, 00:29:45.000 "data_offset": 256, 00:29:45.000 "data_size": 7936 00:29:45.000 }, 
00:29:45.000 { 00:29:45.000 "name": "BaseBdev2", 00:29:45.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.000 "is_configured": false, 00:29:45.000 "data_offset": 0, 00:29:45.000 "data_size": 0 00:29:45.000 } 00:29:45.000 ] 00:29:45.000 }' 00:29:45.000 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.000 20:05:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:45.982 20:05:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:46.241 [2024-07-24 20:05:37.761373] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:46.241 [2024-07-24 20:05:37.761426] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x171d2e0 name Existed_Raid, state configuring 00:29:46.241 20:05:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:46.500 [2024-07-24 20:05:38.010074] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:46.500 [2024-07-24 20:05:38.011617] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:46.500 [2024-07-24 20:05:38.011650] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:46.500 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:46.500 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:46.500 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:46.500 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:46.500 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:46.500 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.501 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:46.760 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:46.760 "name": "Existed_Raid", 00:29:46.760 "uuid": "7d769d2e-4ef1-4c88-83da-0212976f9a3e", 00:29:46.760 "strip_size_kb": 0, 00:29:46.760 "state": "configuring", 00:29:46.760 "raid_level": "raid1", 00:29:46.760 "superblock": true, 00:29:46.760 "num_base_bdevs": 2, 
00:29:46.760 "num_base_bdevs_discovered": 1, 00:29:46.760 "num_base_bdevs_operational": 2, 00:29:46.760 "base_bdevs_list": [ 00:29:46.760 { 00:29:46.760 "name": "BaseBdev1", 00:29:46.760 "uuid": "a499368a-7af5-4d2f-becb-cb2a303f7c5a", 00:29:46.760 "is_configured": true, 00:29:46.760 "data_offset": 256, 00:29:46.760 "data_size": 7936 00:29:46.760 }, 00:29:46.760 { 00:29:46.761 "name": "BaseBdev2", 00:29:46.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:46.761 "is_configured": false, 00:29:46.761 "data_offset": 0, 00:29:46.761 "data_size": 0 00:29:46.761 } 00:29:46.761 ] 00:29:46.761 }' 00:29:46.761 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:46.761 20:05:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:29:48.138 [2024-07-24 20:05:39.595122] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:48.138 [2024-07-24 20:05:39.595265] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x171f1c0 00:29:48.138 [2024-07-24 20:05:39.595278] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:48.138 [2024-07-24 20:05:39.595338] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x171c9e0 00:29:48.138 [2024-07-24 20:05:39.595427] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x171f1c0 00:29:48.138 [2024-07-24 20:05:39.595438] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x171f1c0 00:29:48.138 [2024-07-24 20:05:39.595493] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:48.138 BaseBdev2 
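Each base bdev in this test is created with `bdev_malloc_create 32 4096 -m 32 -i -b BaseBdevN`: a 32 MiB malloc disk with 4096-byte blocks, 32 bytes of metadata per block, interleaved (`-i`). That is why the `bdev_get_bdevs` dumps above report `"block_size": 4128` and `"num_blocks": 8192`. A small sketch of that geometry arithmetic (the helper name is ours, not an SPDK API):

```python
def malloc_md_geometry(size_mb, block_size, md_size, interleaved=True):
    """Expected geometry for: bdev_malloc_create <size_mb> <block_size> -m <md_size> [-i].

    With interleaved metadata, each logical block carries its metadata inline,
    so the reported block_size grows by md_size; a separate metadata region
    would leave block_size unchanged."""
    num_blocks = size_mb * 1024 * 1024 // block_size
    reported_block_size = block_size + md_size if interleaved else block_size
    return reported_block_size, num_blocks

# bdev_malloc_create 32 4096 -m 32 -i  ->  block_size 4128, num_blocks 8192
print(malloc_md_geometry(32, 4096, 32))  # (4128, 8192)
```

The raid volume built on top reports `"num_blocks": 7936` rather than 8192 because `-s` reserves space for the superblock, matching the `data_offset: 256` / `data_size: 7936` entries in the base bdev list.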
00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:48.138 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:48.397 20:05:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:48.656 [ 00:29:48.656 { 00:29:48.656 "name": "BaseBdev2", 00:29:48.656 "aliases": [ 00:29:48.656 "e54137aa-63a9-47fc-b73e-306e37f15aa1" 00:29:48.656 ], 00:29:48.656 "product_name": "Malloc disk", 00:29:48.656 "block_size": 4128, 00:29:48.656 "num_blocks": 8192, 00:29:48.656 "uuid": "e54137aa-63a9-47fc-b73e-306e37f15aa1", 00:29:48.656 "md_size": 32, 00:29:48.656 "md_interleave": true, 00:29:48.656 "dif_type": 0, 00:29:48.656 "assigned_rate_limits": { 00:29:48.656 "rw_ios_per_sec": 0, 00:29:48.656 "rw_mbytes_per_sec": 0, 00:29:48.656 "r_mbytes_per_sec": 0, 00:29:48.656 "w_mbytes_per_sec": 0 00:29:48.656 }, 00:29:48.656 "claimed": true, 00:29:48.656 "claim_type": "exclusive_write", 00:29:48.656 "zoned": false, 00:29:48.656 "supported_io_types": { 
00:29:48.656 "read": true, 00:29:48.656 "write": true, 00:29:48.656 "unmap": true, 00:29:48.656 "flush": true, 00:29:48.656 "reset": true, 00:29:48.656 "nvme_admin": false, 00:29:48.656 "nvme_io": false, 00:29:48.656 "nvme_io_md": false, 00:29:48.656 "write_zeroes": true, 00:29:48.656 "zcopy": true, 00:29:48.656 "get_zone_info": false, 00:29:48.656 "zone_management": false, 00:29:48.656 "zone_append": false, 00:29:48.656 "compare": false, 00:29:48.656 "compare_and_write": false, 00:29:48.656 "abort": true, 00:29:48.656 "seek_hole": false, 00:29:48.656 "seek_data": false, 00:29:48.656 "copy": true, 00:29:48.656 "nvme_iov_md": false 00:29:48.656 }, 00:29:48.656 "memory_domains": [ 00:29:48.656 { 00:29:48.656 "dma_device_id": "system", 00:29:48.656 "dma_device_type": 1 00:29:48.656 }, 00:29:48.656 { 00:29:48.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:48.656 "dma_device_type": 2 00:29:48.656 } 00:29:48.656 ], 00:29:48.656 "driver_specific": {} 00:29:48.656 } 00:29:48.656 ] 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.656 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:49.224 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:49.224 "name": "Existed_Raid", 00:29:49.224 "uuid": "7d769d2e-4ef1-4c88-83da-0212976f9a3e", 00:29:49.224 "strip_size_kb": 0, 00:29:49.224 "state": "online", 00:29:49.224 "raid_level": "raid1", 00:29:49.224 "superblock": true, 00:29:49.224 "num_base_bdevs": 2, 00:29:49.224 "num_base_bdevs_discovered": 2, 00:29:49.224 "num_base_bdevs_operational": 2, 00:29:49.224 "base_bdevs_list": [ 00:29:49.224 { 00:29:49.224 "name": "BaseBdev1", 00:29:49.224 "uuid": "a499368a-7af5-4d2f-becb-cb2a303f7c5a", 00:29:49.224 "is_configured": true, 00:29:49.224 "data_offset": 256, 00:29:49.224 "data_size": 7936 00:29:49.224 }, 00:29:49.224 { 00:29:49.224 "name": "BaseBdev2", 00:29:49.224 "uuid": "e54137aa-63a9-47fc-b73e-306e37f15aa1", 00:29:49.224 "is_configured": true, 00:29:49.224 "data_offset": 256, 00:29:49.224 
"data_size": 7936 00:29:49.224 } 00:29:49.224 ] 00:29:49.224 }' 00:29:49.224 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:49.224 20:05:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:49.812 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:50.071 [2024-07-24 20:05:41.556657] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:50.071 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:50.071 "name": "Existed_Raid", 00:29:50.071 "aliases": [ 00:29:50.071 "7d769d2e-4ef1-4c88-83da-0212976f9a3e" 00:29:50.071 ], 00:29:50.071 "product_name": "Raid Volume", 00:29:50.071 "block_size": 4128, 00:29:50.071 "num_blocks": 7936, 00:29:50.071 "uuid": "7d769d2e-4ef1-4c88-83da-0212976f9a3e", 00:29:50.071 "md_size": 32, 
00:29:50.071 "md_interleave": true, 00:29:50.071 "dif_type": 0, 00:29:50.071 "assigned_rate_limits": { 00:29:50.071 "rw_ios_per_sec": 0, 00:29:50.071 "rw_mbytes_per_sec": 0, 00:29:50.071 "r_mbytes_per_sec": 0, 00:29:50.071 "w_mbytes_per_sec": 0 00:29:50.071 }, 00:29:50.071 "claimed": false, 00:29:50.071 "zoned": false, 00:29:50.071 "supported_io_types": { 00:29:50.071 "read": true, 00:29:50.071 "write": true, 00:29:50.071 "unmap": false, 00:29:50.071 "flush": false, 00:29:50.071 "reset": true, 00:29:50.071 "nvme_admin": false, 00:29:50.071 "nvme_io": false, 00:29:50.071 "nvme_io_md": false, 00:29:50.071 "write_zeroes": true, 00:29:50.071 "zcopy": false, 00:29:50.071 "get_zone_info": false, 00:29:50.071 "zone_management": false, 00:29:50.071 "zone_append": false, 00:29:50.071 "compare": false, 00:29:50.071 "compare_and_write": false, 00:29:50.071 "abort": false, 00:29:50.071 "seek_hole": false, 00:29:50.071 "seek_data": false, 00:29:50.071 "copy": false, 00:29:50.071 "nvme_iov_md": false 00:29:50.071 }, 00:29:50.071 "memory_domains": [ 00:29:50.071 { 00:29:50.071 "dma_device_id": "system", 00:29:50.071 "dma_device_type": 1 00:29:50.071 }, 00:29:50.071 { 00:29:50.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:50.071 "dma_device_type": 2 00:29:50.071 }, 00:29:50.071 { 00:29:50.071 "dma_device_id": "system", 00:29:50.071 "dma_device_type": 1 00:29:50.071 }, 00:29:50.071 { 00:29:50.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:50.071 "dma_device_type": 2 00:29:50.071 } 00:29:50.071 ], 00:29:50.071 "driver_specific": { 00:29:50.071 "raid": { 00:29:50.071 "uuid": "7d769d2e-4ef1-4c88-83da-0212976f9a3e", 00:29:50.071 "strip_size_kb": 0, 00:29:50.071 "state": "online", 00:29:50.071 "raid_level": "raid1", 00:29:50.071 "superblock": true, 00:29:50.071 "num_base_bdevs": 2, 00:29:50.071 "num_base_bdevs_discovered": 2, 00:29:50.071 "num_base_bdevs_operational": 2, 00:29:50.071 "base_bdevs_list": [ 00:29:50.071 { 00:29:50.071 "name": "BaseBdev1", 00:29:50.071 "uuid": 
"a499368a-7af5-4d2f-becb-cb2a303f7c5a", 00:29:50.071 "is_configured": true, 00:29:50.071 "data_offset": 256, 00:29:50.071 "data_size": 7936 00:29:50.071 }, 00:29:50.071 { 00:29:50.071 "name": "BaseBdev2", 00:29:50.071 "uuid": "e54137aa-63a9-47fc-b73e-306e37f15aa1", 00:29:50.071 "is_configured": true, 00:29:50.071 "data_offset": 256, 00:29:50.071 "data_size": 7936 00:29:50.071 } 00:29:50.071 ] 00:29:50.071 } 00:29:50.071 } 00:29:50.071 }' 00:29:50.071 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:50.071 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:50.071 BaseBdev2' 00:29:50.071 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:50.071 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:50.071 20:05:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:50.639 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:50.639 "name": "BaseBdev1", 00:29:50.639 "aliases": [ 00:29:50.639 "a499368a-7af5-4d2f-becb-cb2a303f7c5a" 00:29:50.639 ], 00:29:50.639 "product_name": "Malloc disk", 00:29:50.639 "block_size": 4128, 00:29:50.639 "num_blocks": 8192, 00:29:50.639 "uuid": "a499368a-7af5-4d2f-becb-cb2a303f7c5a", 00:29:50.639 "md_size": 32, 00:29:50.639 "md_interleave": true, 00:29:50.639 "dif_type": 0, 00:29:50.639 "assigned_rate_limits": { 00:29:50.640 "rw_ios_per_sec": 0, 00:29:50.640 "rw_mbytes_per_sec": 0, 00:29:50.640 "r_mbytes_per_sec": 0, 00:29:50.640 "w_mbytes_per_sec": 0 00:29:50.640 }, 00:29:50.640 "claimed": 
true, 00:29:50.640 "claim_type": "exclusive_write", 00:29:50.640 "zoned": false, 00:29:50.640 "supported_io_types": { 00:29:50.640 "read": true, 00:29:50.640 "write": true, 00:29:50.640 "unmap": true, 00:29:50.640 "flush": true, 00:29:50.640 "reset": true, 00:29:50.640 "nvme_admin": false, 00:29:50.640 "nvme_io": false, 00:29:50.640 "nvme_io_md": false, 00:29:50.640 "write_zeroes": true, 00:29:50.640 "zcopy": true, 00:29:50.640 "get_zone_info": false, 00:29:50.640 "zone_management": false, 00:29:50.640 "zone_append": false, 00:29:50.640 "compare": false, 00:29:50.640 "compare_and_write": false, 00:29:50.640 "abort": true, 00:29:50.640 "seek_hole": false, 00:29:50.640 "seek_data": false, 00:29:50.640 "copy": true, 00:29:50.640 "nvme_iov_md": false 00:29:50.640 }, 00:29:50.640 "memory_domains": [ 00:29:50.640 { 00:29:50.640 "dma_device_id": "system", 00:29:50.640 "dma_device_type": 1 00:29:50.640 }, 00:29:50.640 { 00:29:50.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:50.640 "dma_device_type": 2 00:29:50.640 } 00:29:50.640 ], 00:29:50.640 "driver_specific": {} 00:29:50.640 }' 00:29:50.640 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:50.640 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:50.898 20:05:42 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:50.898 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:51.158 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:51.158 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:51.158 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:51.158 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:51.158 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:51.158 "name": "BaseBdev2", 00:29:51.159 "aliases": [ 00:29:51.159 "e54137aa-63a9-47fc-b73e-306e37f15aa1" 00:29:51.159 ], 00:29:51.159 "product_name": "Malloc disk", 00:29:51.159 "block_size": 4128, 00:29:51.159 "num_blocks": 8192, 00:29:51.159 "uuid": "e54137aa-63a9-47fc-b73e-306e37f15aa1", 00:29:51.159 "md_size": 32, 00:29:51.159 "md_interleave": true, 00:29:51.159 "dif_type": 0, 00:29:51.159 "assigned_rate_limits": { 00:29:51.159 "rw_ios_per_sec": 0, 00:29:51.159 "rw_mbytes_per_sec": 0, 00:29:51.159 "r_mbytes_per_sec": 0, 00:29:51.159 "w_mbytes_per_sec": 0 00:29:51.159 }, 00:29:51.159 "claimed": true, 00:29:51.159 "claim_type": "exclusive_write", 00:29:51.159 "zoned": false, 00:29:51.159 "supported_io_types": { 00:29:51.159 "read": true, 00:29:51.159 "write": true, 00:29:51.159 "unmap": true, 00:29:51.159 
"flush": true, 00:29:51.159 "reset": true, 00:29:51.159 "nvme_admin": false, 00:29:51.159 "nvme_io": false, 00:29:51.159 "nvme_io_md": false, 00:29:51.159 "write_zeroes": true, 00:29:51.159 "zcopy": true, 00:29:51.159 "get_zone_info": false, 00:29:51.159 "zone_management": false, 00:29:51.159 "zone_append": false, 00:29:51.159 "compare": false, 00:29:51.159 "compare_and_write": false, 00:29:51.159 "abort": true, 00:29:51.159 "seek_hole": false, 00:29:51.159 "seek_data": false, 00:29:51.159 "copy": true, 00:29:51.159 "nvme_iov_md": false 00:29:51.159 }, 00:29:51.159 "memory_domains": [ 00:29:51.159 { 00:29:51.159 "dma_device_id": "system", 00:29:51.159 "dma_device_type": 1 00:29:51.159 }, 00:29:51.159 { 00:29:51.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:51.159 "dma_device_type": 2 00:29:51.159 } 00:29:51.159 ], 00:29:51.159 "driver_specific": {} 00:29:51.159 }' 00:29:51.159 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:51.418 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:51.418 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:51.418 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:51.418 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:51.418 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:51.418 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:51.418 20:05:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:51.677 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:51.677 20:05:43 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:51.677 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:51.677 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:51.677 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:51.936 [2024-07-24 20:05:43.333156] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:51.936 20:05:43 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:51.936 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.196 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:52.196 "name": "Existed_Raid", 00:29:52.196 "uuid": "7d769d2e-4ef1-4c88-83da-0212976f9a3e", 00:29:52.196 "strip_size_kb": 0, 00:29:52.196 "state": "online", 00:29:52.196 "raid_level": "raid1", 00:29:52.196 "superblock": true, 00:29:52.196 "num_base_bdevs": 2, 00:29:52.196 "num_base_bdevs_discovered": 1, 00:29:52.196 "num_base_bdevs_operational": 1, 00:29:52.196 "base_bdevs_list": [ 00:29:52.196 { 00:29:52.196 "name": null, 00:29:52.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:52.196 "is_configured": false, 00:29:52.196 "data_offset": 256, 00:29:52.196 "data_size": 7936 00:29:52.196 }, 00:29:52.196 { 00:29:52.196 "name": "BaseBdev2", 00:29:52.196 "uuid": "e54137aa-63a9-47fc-b73e-306e37f15aa1", 00:29:52.196 "is_configured": true, 00:29:52.196 "data_offset": 256, 00:29:52.196 "data_size": 7936 00:29:52.196 } 00:29:52.196 ] 00:29:52.196 }' 00:29:52.196 
20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:52.196 20:05:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:52.763 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:52.763 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:52.763 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.763 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:53.022 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:53.022 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:53.022 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:53.022 [2024-07-24 20:05:44.593595] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:53.022 [2024-07-24 20:05:44.593676] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:53.022 [2024-07-24 20:05:44.604983] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:53.022 [2024-07-24 20:05:44.605016] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:53.022 [2024-07-24 20:05:44.605028] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x171f1c0 name Existed_Raid, state offline 00:29:53.280 20:05:44 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:53.280 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:53.280 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.280 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1535245 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1535245 ']' 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1535245 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1535245 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 
00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1535245' 00:29:53.539 killing process with pid 1535245 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1535245 00:29:53.539 [2024-07-24 20:05:44.953659] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:53.539 20:05:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1535245 00:29:53.539 [2024-07-24 20:05:44.954611] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:53.798 20:05:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:29:53.798 00:29:53.798 real 0m12.401s 00:29:53.798 user 0m22.262s 00:29:53.798 sys 0m2.209s 00:29:53.798 20:05:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:53.798 20:05:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:53.798 ************************************ 00:29:53.798 END TEST raid_state_function_test_sb_md_interleaved 00:29:53.798 ************************************ 00:29:53.798 20:05:45 bdev_raid -- bdev/bdev_raid.sh@993 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:29:53.798 20:05:45 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:29:53.798 20:05:45 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:53.798 20:05:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:53.798 ************************************ 00:29:53.798 START TEST raid_superblock_test_md_interleaved 00:29:53.798 ************************************ 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:29:53.798 20:05:45 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@414 -- # local strip_size 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@427 -- # raid_pid=1537052 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@428 -- # 
waitforlisten 1537052 /var/tmp/spdk-raid.sock 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1537052 ']' 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:53.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:53.798 20:05:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:53.798 [2024-07-24 20:05:45.339722] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:29:53.798 [2024-07-24 20:05:45.339794] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1537052 ] 00:29:54.058 [2024-07-24 20:05:45.469364] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:54.058 [2024-07-24 20:05:45.566946] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:54.058 [2024-07-24 20:05:45.632591] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:54.058 [2024-07-24 20:05:45.632631] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:29:54.625 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:54.884 malloc1 00:29:54.884 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:55.143 [2024-07-24 20:05:46.567062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:55.143 [2024-07-24 20:05:46.567113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:55.143 [2024-07-24 20:05:46.567135] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c6f40 00:29:55.143 [2024-07-24 20:05:46.567149] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:55.143 [2024-07-24 20:05:46.568699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:55.143 [2024-07-24 20:05:46.568729] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:55.143 pt1 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:55.143 20:05:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:55.712 malloc2 00:29:55.712 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:55.970 [2024-07-24 20:05:47.378242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:55.970 [2024-07-24 20:05:47.378296] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:55.970 [2024-07-24 20:05:47.378316] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1654830 00:29:55.970 [2024-07-24 20:05:47.378329] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:55.971 [2024-07-24 20:05:47.379870] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:55.971 [2024-07-24 20:05:47.379899] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:55.971 pt2 00:29:55.971 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:29:55.971 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:29:55.971 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:29:56.538 [2024-07-24 20:05:47.883584] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:56.538 [2024-07-24 20:05:47.885078] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:56.538 [2024-07-24 20:05:47.885247] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1647870 00:29:56.538 [2024-07-24 20:05:47.885260] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:56.538 [2024-07-24 20:05:47.885343] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c4f70 00:29:56.538 [2024-07-24 20:05:47.885445] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1647870 00:29:56.538 [2024-07-24 20:05:47.885456] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1647870 00:29:56.538 [2024-07-24 20:05:47.885517] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:56.538 20:05:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:57.105 20:05:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:57.105 "name": "raid_bdev1", 00:29:57.105 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:29:57.105 "strip_size_kb": 0, 00:29:57.105 "state": "online", 00:29:57.105 "raid_level": "raid1", 00:29:57.105 "superblock": true, 00:29:57.105 "num_base_bdevs": 2, 00:29:57.105 "num_base_bdevs_discovered": 2, 00:29:57.105 "num_base_bdevs_operational": 2, 00:29:57.105 "base_bdevs_list": [ 00:29:57.105 { 00:29:57.105 "name": "pt1", 00:29:57.105 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:57.105 "is_configured": true, 00:29:57.105 "data_offset": 256, 00:29:57.105 "data_size": 7936 00:29:57.105 }, 00:29:57.105 { 00:29:57.105 "name": "pt2", 00:29:57.105 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:57.105 "is_configured": true, 00:29:57.105 "data_offset": 256, 00:29:57.105 "data_size": 7936 00:29:57.105 } 00:29:57.105 ] 00:29:57.106 }' 00:29:57.106 20:05:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:57.106 20:05:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:57.673 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:29:57.673 20:05:49 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:57.673 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:57.673 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:57.673 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:57.673 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:57.673 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:57.673 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:57.673 [2024-07-24 20:05:49.259456] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:57.932 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:57.932 "name": "raid_bdev1", 00:29:57.932 "aliases": [ 00:29:57.932 "7531c85c-6912-41cc-9b5d-a73590f73cdd" 00:29:57.932 ], 00:29:57.932 "product_name": "Raid Volume", 00:29:57.932 "block_size": 4128, 00:29:57.932 "num_blocks": 7936, 00:29:57.932 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:29:57.932 "md_size": 32, 00:29:57.932 "md_interleave": true, 00:29:57.932 "dif_type": 0, 00:29:57.932 "assigned_rate_limits": { 00:29:57.932 "rw_ios_per_sec": 0, 00:29:57.932 "rw_mbytes_per_sec": 0, 00:29:57.932 "r_mbytes_per_sec": 0, 00:29:57.932 "w_mbytes_per_sec": 0 00:29:57.932 }, 00:29:57.932 "claimed": false, 00:29:57.932 "zoned": false, 00:29:57.932 "supported_io_types": { 00:29:57.932 "read": true, 00:29:57.932 "write": true, 00:29:57.932 "unmap": false, 00:29:57.932 "flush": false, 00:29:57.932 "reset": true, 00:29:57.932 "nvme_admin": false, 
00:29:57.932 "nvme_io": false, 00:29:57.932 "nvme_io_md": false, 00:29:57.932 "write_zeroes": true, 00:29:57.932 "zcopy": false, 00:29:57.932 "get_zone_info": false, 00:29:57.932 "zone_management": false, 00:29:57.932 "zone_append": false, 00:29:57.932 "compare": false, 00:29:57.932 "compare_and_write": false, 00:29:57.932 "abort": false, 00:29:57.932 "seek_hole": false, 00:29:57.932 "seek_data": false, 00:29:57.932 "copy": false, 00:29:57.932 "nvme_iov_md": false 00:29:57.932 }, 00:29:57.932 "memory_domains": [ 00:29:57.932 { 00:29:57.932 "dma_device_id": "system", 00:29:57.932 "dma_device_type": 1 00:29:57.932 }, 00:29:57.932 { 00:29:57.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:57.932 "dma_device_type": 2 00:29:57.932 }, 00:29:57.932 { 00:29:57.932 "dma_device_id": "system", 00:29:57.932 "dma_device_type": 1 00:29:57.932 }, 00:29:57.932 { 00:29:57.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:57.932 "dma_device_type": 2 00:29:57.932 } 00:29:57.932 ], 00:29:57.932 "driver_specific": { 00:29:57.932 "raid": { 00:29:57.932 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:29:57.932 "strip_size_kb": 0, 00:29:57.932 "state": "online", 00:29:57.932 "raid_level": "raid1", 00:29:57.932 "superblock": true, 00:29:57.932 "num_base_bdevs": 2, 00:29:57.932 "num_base_bdevs_discovered": 2, 00:29:57.932 "num_base_bdevs_operational": 2, 00:29:57.932 "base_bdevs_list": [ 00:29:57.932 { 00:29:57.932 "name": "pt1", 00:29:57.932 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:57.932 "is_configured": true, 00:29:57.932 "data_offset": 256, 00:29:57.932 "data_size": 7936 00:29:57.932 }, 00:29:57.932 { 00:29:57.932 "name": "pt2", 00:29:57.932 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:57.932 "is_configured": true, 00:29:57.932 "data_offset": 256, 00:29:57.932 "data_size": 7936 00:29:57.932 } 00:29:57.932 ] 00:29:57.932 } 00:29:57.932 } 00:29:57.932 }' 00:29:57.932 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:57.932 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:57.932 pt2' 00:29:57.932 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:57.932 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:57.933 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:58.191 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:58.191 "name": "pt1", 00:29:58.191 "aliases": [ 00:29:58.191 "00000000-0000-0000-0000-000000000001" 00:29:58.191 ], 00:29:58.191 "product_name": "passthru", 00:29:58.191 "block_size": 4128, 00:29:58.191 "num_blocks": 8192, 00:29:58.191 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:58.191 "md_size": 32, 00:29:58.191 "md_interleave": true, 00:29:58.191 "dif_type": 0, 00:29:58.191 "assigned_rate_limits": { 00:29:58.191 "rw_ios_per_sec": 0, 00:29:58.191 "rw_mbytes_per_sec": 0, 00:29:58.191 "r_mbytes_per_sec": 0, 00:29:58.191 "w_mbytes_per_sec": 0 00:29:58.191 }, 00:29:58.191 "claimed": true, 00:29:58.191 "claim_type": "exclusive_write", 00:29:58.191 "zoned": false, 00:29:58.191 "supported_io_types": { 00:29:58.191 "read": true, 00:29:58.191 "write": true, 00:29:58.191 "unmap": true, 00:29:58.191 "flush": true, 00:29:58.191 "reset": true, 00:29:58.191 "nvme_admin": false, 00:29:58.191 "nvme_io": false, 00:29:58.191 "nvme_io_md": false, 00:29:58.191 "write_zeroes": true, 00:29:58.191 "zcopy": true, 00:29:58.191 "get_zone_info": false, 00:29:58.191 "zone_management": false, 00:29:58.191 "zone_append": false, 00:29:58.191 "compare": false, 00:29:58.191 "compare_and_write": false, 00:29:58.191 
"abort": true, 00:29:58.191 "seek_hole": false, 00:29:58.191 "seek_data": false, 00:29:58.191 "copy": true, 00:29:58.191 "nvme_iov_md": false 00:29:58.191 }, 00:29:58.191 "memory_domains": [ 00:29:58.192 { 00:29:58.192 "dma_device_id": "system", 00:29:58.192 "dma_device_type": 1 00:29:58.192 }, 00:29:58.192 { 00:29:58.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:58.192 "dma_device_type": 2 00:29:58.192 } 00:29:58.192 ], 00:29:58.192 "driver_specific": { 00:29:58.192 "passthru": { 00:29:58.192 "name": "pt1", 00:29:58.192 "base_bdev_name": "malloc1" 00:29:58.192 } 00:29:58.192 } 00:29:58.192 }' 00:29:58.192 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:58.192 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:58.192 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:58.192 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:58.192 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:58.192 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:58.451 20:05:49 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:58.451 20:05:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:58.709 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:58.710 "name": "pt2", 00:29:58.710 "aliases": [ 00:29:58.710 "00000000-0000-0000-0000-000000000002" 00:29:58.710 ], 00:29:58.710 "product_name": "passthru", 00:29:58.710 "block_size": 4128, 00:29:58.710 "num_blocks": 8192, 00:29:58.710 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:58.710 "md_size": 32, 00:29:58.710 "md_interleave": true, 00:29:58.710 "dif_type": 0, 00:29:58.710 "assigned_rate_limits": { 00:29:58.710 "rw_ios_per_sec": 0, 00:29:58.710 "rw_mbytes_per_sec": 0, 00:29:58.710 "r_mbytes_per_sec": 0, 00:29:58.710 "w_mbytes_per_sec": 0 00:29:58.710 }, 00:29:58.710 "claimed": true, 00:29:58.710 "claim_type": "exclusive_write", 00:29:58.710 "zoned": false, 00:29:58.710 "supported_io_types": { 00:29:58.710 "read": true, 00:29:58.710 "write": true, 00:29:58.710 "unmap": true, 00:29:58.710 "flush": true, 00:29:58.710 "reset": true, 00:29:58.710 "nvme_admin": false, 00:29:58.710 "nvme_io": false, 00:29:58.710 "nvme_io_md": false, 00:29:58.710 "write_zeroes": true, 00:29:58.710 "zcopy": true, 00:29:58.710 "get_zone_info": false, 00:29:58.710 "zone_management": false, 00:29:58.710 "zone_append": false, 00:29:58.710 "compare": false, 00:29:58.710 "compare_and_write": false, 00:29:58.710 "abort": true, 00:29:58.710 "seek_hole": false, 00:29:58.710 "seek_data": false, 00:29:58.710 "copy": true, 00:29:58.710 "nvme_iov_md": false 00:29:58.710 }, 00:29:58.710 "memory_domains": [ 00:29:58.710 { 00:29:58.710 "dma_device_id": 
"system", 00:29:58.710 "dma_device_type": 1 00:29:58.710 }, 00:29:58.710 { 00:29:58.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:58.710 "dma_device_type": 2 00:29:58.710 } 00:29:58.710 ], 00:29:58.710 "driver_specific": { 00:29:58.710 "passthru": { 00:29:58.710 "name": "pt2", 00:29:58.710 "base_bdev_name": "malloc2" 00:29:58.710 } 00:29:58.710 } 00:29:58.710 }' 00:29:58.710 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:58.710 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:58.969 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:58.969 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:58.969 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:58.969 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:58.969 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:58.969 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:59.228 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:59.228 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:59.228 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:59.228 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:59.228 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:59.228 20:05:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:29:59.487 [2024-07-24 20:05:50.891770] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:59.487 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=7531c85c-6912-41cc-9b5d-a73590f73cdd 00:29:59.487 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # '[' -z 7531c85c-6912-41cc-9b5d-a73590f73cdd ']' 00:29:59.487 20:05:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:00.054 [2024-07-24 20:05:51.392839] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:00.054 [2024-07-24 20:05:51.392865] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:00.054 [2024-07-24 20:05:51.392925] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:00.054 [2024-07-24 20:05:51.392980] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:00.054 [2024-07-24 20:05:51.392992] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1647870 name raid_bdev1, state offline 00:30:00.054 20:05:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.054 20:05:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:30:00.622 20:05:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:30:00.622 20:05:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:30:00.622 20:05:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:30:00.622 20:05:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:00.881 20:05:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:30:00.881 20:05:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:01.448 20:05:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:30:01.448 20:05:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:02.016 20:05:53 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:02.016 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:02.275 [2024-07-24 20:05:53.714851] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:30:02.275 [2024-07-24 20:05:53.716302] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:30:02.275 [2024-07-24 20:05:53.716362] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:30:02.275 [2024-07-24 20:05:53.716410] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:30:02.275 [2024-07-24 20:05:53.716429] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:02.275 [2024-07-24 20:05:53.716440] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c73e0 name raid_bdev1, state configuring 00:30:02.275 request: 00:30:02.275 { 00:30:02.275 "name": "raid_bdev1", 00:30:02.275 "raid_level": "raid1", 00:30:02.275 "base_bdevs": [ 00:30:02.275 "malloc1", 00:30:02.275 "malloc2" 00:30:02.275 ], 00:30:02.275 "superblock": false, 00:30:02.275 "method": "bdev_raid_create", 00:30:02.275 "req_id": 1 00:30:02.275 } 00:30:02.275 Got JSON-RPC error response 00:30:02.275 response: 00:30:02.275 { 00:30:02.275 "code": -17, 00:30:02.275 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:30:02.275 } 00:30:02.275 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:30:02.275 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:30:02.275 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:30:02.275 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:30:02.275 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:02.275 20:05:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:30:02.880 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:30:02.880 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:30:02.880 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:30:03.139 [2024-07-24 20:05:54.476833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:03.139 [2024-07-24 20:05:54.476883] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:03.139 [2024-07-24 20:05:54.476902] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c7170 00:30:03.139 [2024-07-24 20:05:54.476914] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:03.139 [2024-07-24 20:05:54.478335] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:03.139 [2024-07-24 20:05:54.478363] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:03.139 [2024-07-24 20:05:54.478423] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:03.139 [2024-07-24 20:05:54.478453] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:03.139 pt1 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.139 20:05:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:03.706 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:03.706 "name": "raid_bdev1", 00:30:03.706 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:30:03.706 "strip_size_kb": 0, 00:30:03.706 "state": "configuring", 00:30:03.706 "raid_level": "raid1", 00:30:03.706 "superblock": true, 00:30:03.706 "num_base_bdevs": 2, 00:30:03.706 "num_base_bdevs_discovered": 1, 00:30:03.706 "num_base_bdevs_operational": 2, 00:30:03.706 "base_bdevs_list": [ 00:30:03.706 { 00:30:03.706 "name": "pt1", 00:30:03.706 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:03.706 "is_configured": true, 00:30:03.706 "data_offset": 256, 00:30:03.706 "data_size": 7936 00:30:03.706 }, 00:30:03.706 { 00:30:03.706 "name": null, 00:30:03.706 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:03.706 "is_configured": false, 00:30:03.706 "data_offset": 256, 00:30:03.706 "data_size": 7936 00:30:03.706 } 00:30:03.706 ] 00:30:03.706 }' 00:30:03.706 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:03.706 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:30:04.274 20:05:55 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:04.274 [2024-07-24 20:05:55.836458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:04.274 [2024-07-24 20:05:55.836508] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:04.274 [2024-07-24 20:05:55.836530] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16482e0 00:30:04.274 [2024-07-24 20:05:55.836543] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:04.274 [2024-07-24 20:05:55.836715] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:04.274 [2024-07-24 20:05:55.836731] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:04.274 [2024-07-24 20:05:55.836775] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:04.274 [2024-07-24 20:05:55.836795] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:04.274 [2024-07-24 20:05:55.836875] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14c5810 00:30:04.274 [2024-07-24 20:05:55.836885] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:04.274 [2024-07-24 20:05:55.836939] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1647c90 00:30:04.274 [2024-07-24 20:05:55.837017] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14c5810 00:30:04.274 [2024-07-24 20:05:55.837027] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14c5810 00:30:04.274 [2024-07-24 20:05:55.837083] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:04.274 pt2 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:04.274 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:04.532 20:05:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.532 20:05:55 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:04.532 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:04.532 "name": "raid_bdev1", 00:30:04.532 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:30:04.532 "strip_size_kb": 0, 00:30:04.532 "state": "online", 00:30:04.532 "raid_level": "raid1", 00:30:04.532 "superblock": true, 00:30:04.532 "num_base_bdevs": 2, 00:30:04.532 "num_base_bdevs_discovered": 2, 00:30:04.532 "num_base_bdevs_operational": 2, 00:30:04.532 "base_bdevs_list": [ 00:30:04.532 { 00:30:04.532 "name": "pt1", 00:30:04.532 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:04.532 "is_configured": true, 00:30:04.532 "data_offset": 256, 00:30:04.532 "data_size": 7936 00:30:04.532 }, 00:30:04.532 { 00:30:04.532 "name": "pt2", 00:30:04.532 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:04.532 "is_configured": true, 00:30:04.532 "data_offset": 256, 00:30:04.532 "data_size": 7936 00:30:04.532 } 00:30:04.532 ] 00:30:04.532 }' 00:30:04.532 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:04.532 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:05.099 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:30:05.099 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:05.099 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:05.099 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:05.099 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:05.099 20:05:56 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:05.099 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:05.099 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:05.357 [2024-07-24 20:05:56.879487] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:05.357 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:05.357 "name": "raid_bdev1", 00:30:05.357 "aliases": [ 00:30:05.357 "7531c85c-6912-41cc-9b5d-a73590f73cdd" 00:30:05.357 ], 00:30:05.357 "product_name": "Raid Volume", 00:30:05.357 "block_size": 4128, 00:30:05.357 "num_blocks": 7936, 00:30:05.357 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:30:05.357 "md_size": 32, 00:30:05.357 "md_interleave": true, 00:30:05.357 "dif_type": 0, 00:30:05.357 "assigned_rate_limits": { 00:30:05.357 "rw_ios_per_sec": 0, 00:30:05.357 "rw_mbytes_per_sec": 0, 00:30:05.357 "r_mbytes_per_sec": 0, 00:30:05.357 "w_mbytes_per_sec": 0 00:30:05.357 }, 00:30:05.357 "claimed": false, 00:30:05.357 "zoned": false, 00:30:05.357 "supported_io_types": { 00:30:05.357 "read": true, 00:30:05.357 "write": true, 00:30:05.357 "unmap": false, 00:30:05.357 "flush": false, 00:30:05.357 "reset": true, 00:30:05.357 "nvme_admin": false, 00:30:05.357 "nvme_io": false, 00:30:05.357 "nvme_io_md": false, 00:30:05.357 "write_zeroes": true, 00:30:05.357 "zcopy": false, 00:30:05.357 "get_zone_info": false, 00:30:05.357 "zone_management": false, 00:30:05.357 "zone_append": false, 00:30:05.357 "compare": false, 00:30:05.357 "compare_and_write": false, 00:30:05.357 "abort": false, 00:30:05.357 "seek_hole": false, 00:30:05.357 "seek_data": false, 00:30:05.357 "copy": false, 00:30:05.357 "nvme_iov_md": false 00:30:05.357 }, 
00:30:05.357 "memory_domains": [ 00:30:05.357 { 00:30:05.357 "dma_device_id": "system", 00:30:05.357 "dma_device_type": 1 00:30:05.357 }, 00:30:05.357 { 00:30:05.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.357 "dma_device_type": 2 00:30:05.357 }, 00:30:05.357 { 00:30:05.357 "dma_device_id": "system", 00:30:05.357 "dma_device_type": 1 00:30:05.357 }, 00:30:05.357 { 00:30:05.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.357 "dma_device_type": 2 00:30:05.357 } 00:30:05.357 ], 00:30:05.357 "driver_specific": { 00:30:05.357 "raid": { 00:30:05.357 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:30:05.357 "strip_size_kb": 0, 00:30:05.357 "state": "online", 00:30:05.357 "raid_level": "raid1", 00:30:05.357 "superblock": true, 00:30:05.357 "num_base_bdevs": 2, 00:30:05.357 "num_base_bdevs_discovered": 2, 00:30:05.357 "num_base_bdevs_operational": 2, 00:30:05.357 "base_bdevs_list": [ 00:30:05.357 { 00:30:05.357 "name": "pt1", 00:30:05.357 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:05.357 "is_configured": true, 00:30:05.357 "data_offset": 256, 00:30:05.357 "data_size": 7936 00:30:05.357 }, 00:30:05.357 { 00:30:05.357 "name": "pt2", 00:30:05.357 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:05.357 "is_configured": true, 00:30:05.357 "data_offset": 256, 00:30:05.357 "data_size": 7936 00:30:05.357 } 00:30:05.357 ] 00:30:05.357 } 00:30:05.357 } 00:30:05.357 }' 00:30:05.357 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:05.357 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:05.357 pt2' 00:30:05.357 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:05.357 20:05:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:05.357 20:05:56 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:05.615 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:05.616 "name": "pt1", 00:30:05.616 "aliases": [ 00:30:05.616 "00000000-0000-0000-0000-000000000001" 00:30:05.616 ], 00:30:05.616 "product_name": "passthru", 00:30:05.616 "block_size": 4128, 00:30:05.616 "num_blocks": 8192, 00:30:05.616 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:05.616 "md_size": 32, 00:30:05.616 "md_interleave": true, 00:30:05.616 "dif_type": 0, 00:30:05.616 "assigned_rate_limits": { 00:30:05.616 "rw_ios_per_sec": 0, 00:30:05.616 "rw_mbytes_per_sec": 0, 00:30:05.616 "r_mbytes_per_sec": 0, 00:30:05.616 "w_mbytes_per_sec": 0 00:30:05.616 }, 00:30:05.616 "claimed": true, 00:30:05.616 "claim_type": "exclusive_write", 00:30:05.616 "zoned": false, 00:30:05.616 "supported_io_types": { 00:30:05.616 "read": true, 00:30:05.616 "write": true, 00:30:05.616 "unmap": true, 00:30:05.616 "flush": true, 00:30:05.616 "reset": true, 00:30:05.616 "nvme_admin": false, 00:30:05.616 "nvme_io": false, 00:30:05.616 "nvme_io_md": false, 00:30:05.616 "write_zeroes": true, 00:30:05.616 "zcopy": true, 00:30:05.616 "get_zone_info": false, 00:30:05.616 "zone_management": false, 00:30:05.616 "zone_append": false, 00:30:05.616 "compare": false, 00:30:05.616 "compare_and_write": false, 00:30:05.616 "abort": true, 00:30:05.616 "seek_hole": false, 00:30:05.616 "seek_data": false, 00:30:05.616 "copy": true, 00:30:05.616 "nvme_iov_md": false 00:30:05.616 }, 00:30:05.616 "memory_domains": [ 00:30:05.616 { 00:30:05.616 "dma_device_id": "system", 00:30:05.616 "dma_device_type": 1 00:30:05.616 }, 00:30:05.616 { 00:30:05.616 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:05.616 "dma_device_type": 2 00:30:05.616 } 00:30:05.616 ], 00:30:05.616 "driver_specific": { 00:30:05.616 
"passthru": { 00:30:05.616 "name": "pt1", 00:30:05.616 "base_bdev_name": "malloc1" 00:30:05.616 } 00:30:05.616 } 00:30:05.616 }' 00:30:05.616 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:05.874 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:05.874 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:05.874 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:05.874 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:06.133 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:06.393 20:05:57 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:06.393 "name": "pt2", 00:30:06.393 "aliases": [ 00:30:06.393 "00000000-0000-0000-0000-000000000002" 00:30:06.393 ], 00:30:06.393 "product_name": "passthru", 00:30:06.393 "block_size": 4128, 00:30:06.393 "num_blocks": 8192, 00:30:06.393 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:06.393 "md_size": 32, 00:30:06.393 "md_interleave": true, 00:30:06.393 "dif_type": 0, 00:30:06.393 "assigned_rate_limits": { 00:30:06.393 "rw_ios_per_sec": 0, 00:30:06.393 "rw_mbytes_per_sec": 0, 00:30:06.393 "r_mbytes_per_sec": 0, 00:30:06.393 "w_mbytes_per_sec": 0 00:30:06.393 }, 00:30:06.393 "claimed": true, 00:30:06.393 "claim_type": "exclusive_write", 00:30:06.393 "zoned": false, 00:30:06.393 "supported_io_types": { 00:30:06.393 "read": true, 00:30:06.393 "write": true, 00:30:06.393 "unmap": true, 00:30:06.393 "flush": true, 00:30:06.393 "reset": true, 00:30:06.393 "nvme_admin": false, 00:30:06.393 "nvme_io": false, 00:30:06.393 "nvme_io_md": false, 00:30:06.393 "write_zeroes": true, 00:30:06.393 "zcopy": true, 00:30:06.393 "get_zone_info": false, 00:30:06.393 "zone_management": false, 00:30:06.393 "zone_append": false, 00:30:06.393 "compare": false, 00:30:06.393 "compare_and_write": false, 00:30:06.393 "abort": true, 00:30:06.393 "seek_hole": false, 00:30:06.393 "seek_data": false, 00:30:06.393 "copy": true, 00:30:06.393 "nvme_iov_md": false 00:30:06.393 }, 00:30:06.393 "memory_domains": [ 00:30:06.393 { 00:30:06.393 "dma_device_id": "system", 00:30:06.393 "dma_device_type": 1 00:30:06.393 }, 00:30:06.393 { 00:30:06.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:06.393 "dma_device_type": 2 00:30:06.393 } 00:30:06.393 ], 00:30:06.393 "driver_specific": { 00:30:06.393 "passthru": { 00:30:06.393 "name": "pt2", 00:30:06.393 "base_bdev_name": "malloc2" 00:30:06.393 } 00:30:06.393 } 00:30:06.393 }' 00:30:06.393 20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:06.652 
20:05:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:06.652 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:06.652 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:06.652 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:06.652 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:06.653 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:06.912 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:06.912 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:06.912 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.912 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:06.912 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:06.912 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:06.912 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:30:07.480 [2024-07-24 20:05:58.965169] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:07.480 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # '[' 7531c85c-6912-41cc-9b5d-a73590f73cdd '!=' 7531c85c-6912-41cc-9b5d-a73590f73cdd ']' 00:30:07.480 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 
00:30:07.480 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:07.480 20:05:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:30:07.480 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:08.049 [2024-07-24 20:05:59.486316] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:30:08.049 20:05:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:08.617 20:06:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:08.617 "name": "raid_bdev1", 00:30:08.617 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:30:08.617 "strip_size_kb": 0, 00:30:08.617 "state": "online", 00:30:08.617 "raid_level": "raid1", 00:30:08.617 "superblock": true, 00:30:08.617 "num_base_bdevs": 2, 00:30:08.617 "num_base_bdevs_discovered": 1, 00:30:08.617 "num_base_bdevs_operational": 1, 00:30:08.617 "base_bdevs_list": [ 00:30:08.617 { 00:30:08.617 "name": null, 00:30:08.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:08.617 "is_configured": false, 00:30:08.617 "data_offset": 256, 00:30:08.617 "data_size": 7936 00:30:08.617 }, 00:30:08.617 { 00:30:08.617 "name": "pt2", 00:30:08.617 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:08.617 "is_configured": true, 00:30:08.617 "data_offset": 256, 00:30:08.617 "data_size": 7936 00:30:08.617 } 00:30:08.617 ] 00:30:08.617 }' 00:30:08.617 20:06:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:08.617 20:06:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:09.554 20:06:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:09.554 [2024-07-24 20:06:01.126659] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:09.554 [2024-07-24 20:06:01.126688] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:09.554 [2024-07-24 20:06:01.126746] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:09.554 [2024-07-24 20:06:01.126791] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:09.554 [2024-07-24 20:06:01.126803] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c5810 name raid_bdev1, state offline 00:30:09.812 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.812 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:30:09.812 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:30:09.812 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:30:09.813 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:30:09.813 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:30:09.813 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:10.071 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:30:10.071 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:30:10.071 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:30:10.071 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:30:10.071 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@534 -- # i=1 00:30:10.071 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:10.330 [2024-07-24 20:06:01.872591] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:10.330 [2024-07-24 20:06:01.872641] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:10.330 [2024-07-24 20:06:01.872659] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c50e0 00:30:10.330 [2024-07-24 20:06:01.872678] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:10.330 [2024-07-24 20:06:01.874108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:10.330 [2024-07-24 20:06:01.874136] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:10.330 [2024-07-24 20:06:01.874186] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:10.330 [2024-07-24 20:06:01.874216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:10.330 [2024-07-24 20:06:01.874286] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14c5c30 00:30:10.330 [2024-07-24 20:06:01.874296] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:10.330 [2024-07-24 20:06:01.874355] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c5f10 00:30:10.330 [2024-07-24 20:06:01.874443] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14c5c30 00:30:10.330 [2024-07-24 20:06:01.874454] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14c5c30 00:30:10.330 [2024-07-24 20:06:01.874508] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:10.330 pt2 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:10.330 20:06:01 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.330 20:06:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:10.590 20:06:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:10.590 "name": "raid_bdev1", 00:30:10.590 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:30:10.590 "strip_size_kb": 0, 00:30:10.590 "state": "online", 00:30:10.590 "raid_level": "raid1", 00:30:10.590 "superblock": true, 00:30:10.590 "num_base_bdevs": 2, 00:30:10.590 "num_base_bdevs_discovered": 1, 00:30:10.590 "num_base_bdevs_operational": 1, 00:30:10.590 "base_bdevs_list": [ 00:30:10.590 { 00:30:10.590 "name": null, 00:30:10.590 
"uuid": "00000000-0000-0000-0000-000000000000", 00:30:10.590 "is_configured": false, 00:30:10.590 "data_offset": 256, 00:30:10.590 "data_size": 7936 00:30:10.590 }, 00:30:10.590 { 00:30:10.590 "name": "pt2", 00:30:10.590 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:10.590 "is_configured": true, 00:30:10.590 "data_offset": 256, 00:30:10.590 "data_size": 7936 00:30:10.590 } 00:30:10.590 ] 00:30:10.590 }' 00:30:10.590 20:06:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:10.590 20:06:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:11.527 20:06:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:11.527 [2024-07-24 20:06:02.979516] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:11.527 [2024-07-24 20:06:02.979542] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:11.527 [2024-07-24 20:06:02.979600] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:11.527 [2024-07-24 20:06:02.979644] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:11.527 [2024-07-24 20:06:02.979655] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c5c30 name raid_bdev1, state offline 00:30:11.527 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.527 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:30:12.094 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:30:12.094 20:06:03 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:30:12.094 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:30:12.094 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:12.353 [2024-07-24 20:06:03.745518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:12.353 [2024-07-24 20:06:03.745565] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:12.353 [2024-07-24 20:06:03.745585] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1654fa0 00:30:12.353 [2024-07-24 20:06:03.745597] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:12.353 [2024-07-24 20:06:03.747020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:12.353 [2024-07-24 20:06:03.747048] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:12.353 [2024-07-24 20:06:03.747097] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:12.353 [2024-07-24 20:06:03.747124] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:12.353 [2024-07-24 20:06:03.747208] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:30:12.353 [2024-07-24 20:06:03.747221] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:12.353 [2024-07-24 20:06:03.747235] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x164b220 name raid_bdev1, state configuring 00:30:12.353 [2024-07-24 20:06:03.747258] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:30:12.353 [2024-07-24 20:06:03.747314] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x164a890 00:30:12.353 [2024-07-24 20:06:03.747324] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:12.354 [2024-07-24 20:06:03.747382] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1647c90 00:30:12.354 [2024-07-24 20:06:03.747467] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x164a890 00:30:12.354 [2024-07-24 20:06:03.747477] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x164a890 00:30:12.354 [2024-07-24 20:06:03.747535] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:12.354 pt1 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.354 20:06:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.613 20:06:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:12.613 "name": "raid_bdev1", 00:30:12.613 "uuid": "7531c85c-6912-41cc-9b5d-a73590f73cdd", 00:30:12.613 "strip_size_kb": 0, 00:30:12.613 "state": "online", 00:30:12.613 "raid_level": "raid1", 00:30:12.613 "superblock": true, 00:30:12.613 "num_base_bdevs": 2, 00:30:12.613 "num_base_bdevs_discovered": 1, 00:30:12.613 "num_base_bdevs_operational": 1, 00:30:12.613 "base_bdevs_list": [ 00:30:12.613 { 00:30:12.613 "name": null, 00:30:12.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:12.613 "is_configured": false, 00:30:12.613 "data_offset": 256, 00:30:12.613 "data_size": 7936 00:30:12.613 }, 00:30:12.613 { 00:30:12.613 "name": "pt2", 00:30:12.613 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:12.613 "is_configured": true, 00:30:12.613 "data_offset": 256, 00:30:12.613 "data_size": 7936 00:30:12.613 } 00:30:12.613 ] 00:30:12.613 }' 00:30:12.613 20:06:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:12.613 20:06:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:13.551 20:06:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:30:13.551 20:06:04 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:30:13.810 [2024-07-24 20:06:05.309896] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # '[' 7531c85c-6912-41cc-9b5d-a73590f73cdd '!=' 7531c85c-6912-41cc-9b5d-a73590f73cdd ']' 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@578 -- # killprocess 1537052 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1537052 ']' 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1537052 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1537052 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1537052' 00:30:13.810 
killing process with pid 1537052 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@969 -- # kill 1537052 00:30:13.810 [2024-07-24 20:06:05.386070] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:13.810 [2024-07-24 20:06:05.386123] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:13.810 [2024-07-24 20:06:05.386172] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:13.810 [2024-07-24 20:06:05.386184] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x164a890 name raid_bdev1, state offline 00:30:13.810 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@974 -- # wait 1537052 00:30:13.810 [2024-07-24 20:06:05.402981] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:14.070 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@580 -- # return 0 00:30:14.070 00:30:14.070 real 0m20.334s 00:30:14.070 user 0m37.091s 00:30:14.070 sys 0m3.492s 00:30:14.070 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:14.070 20:06:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:14.070 ************************************ 00:30:14.070 END TEST raid_superblock_test_md_interleaved 00:30:14.070 ************************************ 00:30:14.070 20:06:05 bdev_raid -- bdev/bdev_raid.sh@994 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:30:14.070 20:06:05 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:30:14.070 20:06:05 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:14.070 20:06:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:14.329 ************************************ 00:30:14.329 START TEST raid_rebuild_test_sb_md_interleaved 
00:30:14.329 ************************************ 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false false 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # local verify=false 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:30:14.329 20:06:05 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # local strip_size 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # local create_arg 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@594 -- # local data_offset 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # raid_pid=1540322 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@613 -- # waitforlisten 1540322 /var/tmp/spdk-raid.sock 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1540322 ']' 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:14.329 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:14.329 20:06:05 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:14.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:14.330 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:14.330 20:06:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:14.330 [2024-07-24 20:06:05.764591] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:14.330 [2024-07-24 20:06:05.764663] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1540322 ] 00:30:14.330 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:14.330 Zero copy mechanism will not be used. 
00:30:14.330 [2024-07-24 20:06:05.891550] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:14.588 [2024-07-24 20:06:05.998892] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:14.588 [2024-07-24 20:06:06.055706] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:14.588 [2024-07-24 20:06:06.055734] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:15.156 20:06:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:15.156 20:06:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:30:15.156 20:06:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:30:15.156 20:06:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:30:15.415 BaseBdev1_malloc 00:30:15.415 20:06:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:15.674 [2024-07-24 20:06:07.174826] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:15.674 [2024-07-24 20:06:07.174875] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:15.674 [2024-07-24 20:06:07.174898] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d8630 00:30:15.674 [2024-07-24 20:06:07.174911] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:15.674 [2024-07-24 20:06:07.176511] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:15.674 [2024-07-24 20:06:07.176542] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:15.674 BaseBdev1 00:30:15.674 20:06:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:30:15.674 20:06:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:30:15.933 BaseBdev2_malloc 00:30:15.933 20:06:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:30:16.192 [2024-07-24 20:06:07.669066] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:30:16.192 [2024-07-24 20:06:07.669112] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:16.192 [2024-07-24 20:06:07.669135] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa65dd0 00:30:16.192 [2024-07-24 20:06:07.669148] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:16.192 [2024-07-24 20:06:07.670615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:16.192 [2024-07-24 20:06:07.670642] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:16.192 BaseBdev2 00:30:16.192 20:06:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:30:16.451 spare_malloc 00:30:16.451 20:06:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 
0 -t 0 -w 100000 -n 100000 00:30:16.710 spare_delay 00:30:16.710 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:16.969 [2024-07-24 20:06:08.391868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:16.969 [2024-07-24 20:06:08.391913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:16.969 [2024-07-24 20:06:08.391937] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa68fe0 00:30:16.969 [2024-07-24 20:06:08.391951] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:16.969 [2024-07-24 20:06:08.393358] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:16.969 [2024-07-24 20:06:08.393388] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:16.969 spare 00:30:16.969 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:30:17.228 [2024-07-24 20:06:08.624514] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:17.228 [2024-07-24 20:06:08.625839] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:17.228 [2024-07-24 20:06:08.626008] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa72bd0 00:30:17.228 [2024-07-24 20:06:08.626021] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:17.228 [2024-07-24 20:06:08.626094] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8cea10 00:30:17.228 [2024-07-24 20:06:08.626175] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid 
bdev generic 0xa72bd0 00:30:17.228 [2024-07-24 20:06:08.626185] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa72bd0 00:30:17.228 [2024-07-24 20:06:08.626241] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:17.228 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:17.487 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{
00:30:17.487 "name": "raid_bdev1",
00:30:17.487 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:17.487 "strip_size_kb": 0,
00:30:17.487 "state": "online",
00:30:17.487 "raid_level": "raid1",
00:30:17.487 "superblock": true,
00:30:17.487 "num_base_bdevs": 2,
00:30:17.487 "num_base_bdevs_discovered": 2,
00:30:17.487 "num_base_bdevs_operational": 2,
00:30:17.487 "base_bdevs_list": [
00:30:17.487 {
00:30:17.487 "name": "BaseBdev1",
00:30:17.487 "uuid": "506ab0f2-5199-5366-b272-e31879e72954",
00:30:17.487 "is_configured": true,
00:30:17.487 "data_offset": 256,
00:30:17.487 "data_size": 7936
00:30:17.487 },
00:30:17.487 {
00:30:17.487 "name": "BaseBdev2",
00:30:17.487 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:17.487 "is_configured": true,
00:30:17.487 "data_offset": 256,
00:30:17.487 "data_size": 7936
00:30:17.487 }
00:30:17.487 ]
00:30:17.487 }'
00:30:17.487 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:30:17.487 20:06:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:30:18.055 20:06:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:30:18.055 20:06:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks'
00:30:18.314 [2024-07-24 20:06:09.727672] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:30:18.314 20:06:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936
00:30:18.314 20:06:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:18.314 20:06:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:30:18.573 20:06:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # data_offset=256
00:30:18.573 20:06:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@636 -- # '[' false = true ']'
00:30:18.573 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # '[' false = true ']'
00:30:18.573 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:30:18.833 [2024-07-24 20:06:10.228748] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:18.833 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:19.092 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:30:19.092 "name": "raid_bdev1",
00:30:19.092 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:19.092 "strip_size_kb": 0,
00:30:19.092 "state": "online",
00:30:19.092 "raid_level": "raid1",
00:30:19.092 "superblock": true,
00:30:19.092 "num_base_bdevs": 2,
00:30:19.092 "num_base_bdevs_discovered": 1,
00:30:19.092 "num_base_bdevs_operational": 1,
00:30:19.092 "base_bdevs_list": [
00:30:19.092 {
00:30:19.092 "name": null,
00:30:19.093 "uuid": "00000000-0000-0000-0000-000000000000",
00:30:19.093 "is_configured": false,
00:30:19.093 "data_offset": 256,
00:30:19.093 "data_size": 7936
00:30:19.093 },
00:30:19.093 {
00:30:19.093 "name": "BaseBdev2",
00:30:19.093 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:19.093 "is_configured": true,
00:30:19.093 "data_offset": 256,
00:30:19.093 "data_size": 7936
00:30:19.093 }
00:30:19.093 ]
00:30:19.093 }'
00:30:19.093 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:30:19.093 20:06:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:30:19.660 20:06:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:30:19.950 [2024-07-24 20:06:11.315668] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:30:19.950 [2024-07-24 20:06:11.319312] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8d0520
00:30:19.950 [2024-07-24 20:06:11.321631] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:30:19.950 20:06:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1
00:30:20.887 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:30:20.887 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:30:20.887 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:30:20.887 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:30:20.887 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:30:20.887 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:20.887 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:21.146 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:30:21.146 "name": "raid_bdev1",
00:30:21.146 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:21.146 "strip_size_kb": 0,
00:30:21.146 "state": "online",
00:30:21.146 "raid_level": "raid1",
00:30:21.146 "superblock": true,
00:30:21.146 "num_base_bdevs": 2,
00:30:21.146 "num_base_bdevs_discovered": 2,
00:30:21.146 "num_base_bdevs_operational": 2,
00:30:21.146 "process": {
00:30:21.146 "type": "rebuild",
00:30:21.146 "target": "spare",
00:30:21.146 "progress": {
00:30:21.146 "blocks": 3072,
00:30:21.146 "percent": 38
00:30:21.146 }
00:30:21.146 },
00:30:21.146 "base_bdevs_list": [
00:30:21.146 {
00:30:21.146 "name": "spare",
00:30:21.146 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:21.146 "is_configured": true,
00:30:21.146 "data_offset": 256,
00:30:21.146 "data_size": 7936
00:30:21.146 },
00:30:21.146 {
00:30:21.146 "name": "BaseBdev2",
00:30:21.146 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:21.146 "is_configured": true,
00:30:21.146 "data_offset": 256,
00:30:21.146 "data_size": 7936
00:30:21.146 }
00:30:21.146 ]
00:30:21.146 }'
00:30:21.146 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:30:21.146 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:30:21.146 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:30:21.146 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:30:21.146 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:30:21.405 [2024-07-24 20:06:12.922786] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:30:21.405 [2024-07-24 20:06:12.934587] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:30:21.405 [2024-07-24 20:06:12.934635] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:30:21.405 [2024-07-24 20:06:12.934651] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:30:21.405 [2024-07-24 20:06:12.934660] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:21.405 20:06:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:21.664 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:30:21.664 "name": "raid_bdev1",
00:30:21.664 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:21.664 "strip_size_kb": 0,
00:30:21.664 "state": "online",
00:30:21.664 "raid_level": "raid1",
00:30:21.664 "superblock": true,
00:30:21.664 "num_base_bdevs": 2,
00:30:21.664 "num_base_bdevs_discovered": 1,
00:30:21.664 "num_base_bdevs_operational": 1,
00:30:21.664 "base_bdevs_list": [
00:30:21.664 {
00:30:21.664 "name": null,
00:30:21.664 "uuid": "00000000-0000-0000-0000-000000000000",
00:30:21.664 "is_configured": false,
00:30:21.664 "data_offset": 256,
00:30:21.664 "data_size": 7936
00:30:21.664 },
00:30:21.664 {
00:30:21.664 "name": "BaseBdev2",
00:30:21.664 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:21.664 "is_configured": true,
00:30:21.664 "data_offset": 256,
00:30:21.664 "data_size": 7936
00:30:21.664 }
00:30:21.664 ]
00:30:21.664 }'
00:30:21.664 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:30:21.664 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:30:22.240 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none
00:30:22.240 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:30:22.240 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:30:22.240 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none
00:30:22.240 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:30:22.240 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:22.240 20:06:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:22.501 20:06:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:30:22.501 "name": "raid_bdev1",
00:30:22.501 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:22.501 "strip_size_kb": 0,
00:30:22.501 "state": "online",
00:30:22.501 "raid_level": "raid1",
00:30:22.501 "superblock": true,
00:30:22.501 "num_base_bdevs": 2,
00:30:22.501 "num_base_bdevs_discovered": 1,
00:30:22.501 "num_base_bdevs_operational": 1,
00:30:22.501 "base_bdevs_list": [
00:30:22.501 {
00:30:22.501 "name": null,
00:30:22.501 "uuid": "00000000-0000-0000-0000-000000000000",
00:30:22.501 "is_configured": false,
00:30:22.501 "data_offset": 256,
00:30:22.501 "data_size": 7936
00:30:22.501 },
00:30:22.501 {
00:30:22.501 "name": "BaseBdev2",
00:30:22.501 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:22.501 "is_configured": true,
00:30:22.501 "data_offset": 256,
00:30:22.501 "data_size": 7936
00:30:22.501 }
00:30:22.501 ]
00:30:22.501 }'
00:30:22.501 20:06:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:30:22.759 20:06:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:30:22.759 20:06:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:30:22.759 20:06:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:30:22.759 20:06:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:30:23.018 [2024-07-24 20:06:14.374246] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:30:23.018 [2024-07-24 20:06:14.377870] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8cf6e0
00:30:23.018 [2024-07-24 20:06:14.379358] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:30:23.018 20:06:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@678 -- # sleep 1
00:30:23.955 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:30:23.955 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:30:23.955 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:30:23.955 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:30:23.955 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:30:23.955 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:23.955 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:30:24.215 "name": "raid_bdev1",
00:30:24.215 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:24.215 "strip_size_kb": 0,
00:30:24.215 "state": "online",
00:30:24.215 "raid_level": "raid1",
00:30:24.215 "superblock": true,
00:30:24.215 "num_base_bdevs": 2,
00:30:24.215 "num_base_bdevs_discovered": 2,
00:30:24.215 "num_base_bdevs_operational": 2,
00:30:24.215 "process": {
00:30:24.215 "type": "rebuild",
00:30:24.215 "target": "spare",
00:30:24.215 "progress": {
00:30:24.215 "blocks": 3072,
00:30:24.215 "percent": 38
00:30:24.215 }
00:30:24.215 },
00:30:24.215 "base_bdevs_list": [
00:30:24.215 {
00:30:24.215 "name": "spare",
00:30:24.215 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:24.215 "is_configured": true,
00:30:24.215 "data_offset": 256,
00:30:24.215 "data_size": 7936
00:30:24.215 },
00:30:24.215 {
00:30:24.215 "name": "BaseBdev2",
00:30:24.215 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:24.215 "is_configured": true,
00:30:24.215 "data_offset": 256,
00:30:24.215 "data_size": 7936
00:30:24.215 }
00:30:24.215 ]
00:30:24.215 }'
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' true = true ']'
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' = false ']'
00:30:24.215 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']'
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']'
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # local timeout=1196
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout ))
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:24.215 20:06:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:24.474 20:06:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:30:24.474 "name": "raid_bdev1",
00:30:24.474 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:24.474 "strip_size_kb": 0,
00:30:24.474 "state": "online",
00:30:24.474 "raid_level": "raid1",
00:30:24.474 "superblock": true,
00:30:24.474 "num_base_bdevs": 2,
00:30:24.474 "num_base_bdevs_discovered": 2,
00:30:24.474 "num_base_bdevs_operational": 2,
00:30:24.474 "process": {
00:30:24.474 "type": "rebuild",
00:30:24.474 "target": "spare",
00:30:24.474 "progress": {
00:30:24.474 "blocks": 4096,
00:30:24.474 "percent": 51
00:30:24.474 }
00:30:24.474 },
00:30:24.474 "base_bdevs_list": [
00:30:24.474 {
00:30:24.474 "name": "spare",
00:30:24.474 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:24.474 "is_configured": true,
00:30:24.474 "data_offset": 256,
00:30:24.474 "data_size": 7936
00:30:24.474 },
00:30:24.474 {
00:30:24.474 "name": "BaseBdev2",
00:30:24.474 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:24.474 "is_configured": true,
00:30:24.474 "data_offset": 256,
00:30:24.474 "data_size": 7936
00:30:24.474 }
00:30:24.474 ]
00:30:24.474 }'
00:30:24.474 20:06:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:30:24.733 20:06:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:30:24.733 20:06:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:30:24.733 20:06:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:30:24.733 20:06:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@726 -- # sleep 1
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout ))
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:25.669 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:25.928 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:30:25.928 "name": "raid_bdev1",
00:30:25.928 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:25.928 "strip_size_kb": 0,
00:30:25.928 "state": "online",
00:30:25.928 "raid_level": "raid1",
00:30:25.928 "superblock": true,
00:30:25.928 "num_base_bdevs": 2,
00:30:25.928 "num_base_bdevs_discovered": 2,
00:30:25.928 "num_base_bdevs_operational": 2,
00:30:25.928 "process": {
00:30:25.928 "type": "rebuild",
00:30:25.928 "target": "spare",
00:30:25.928 "progress": {
00:30:25.928 "blocks": 7424,
00:30:25.928 "percent": 93
00:30:25.928 }
00:30:25.928 },
00:30:25.928 "base_bdevs_list": [
00:30:25.928 {
00:30:25.928 "name": "spare",
00:30:25.928 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:25.928 "is_configured": true,
00:30:25.928 "data_offset": 256,
00:30:25.928 "data_size": 7936
00:30:25.928 },
00:30:25.928 {
00:30:25.928 "name": "BaseBdev2",
00:30:25.928 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:25.928 "is_configured": true,
00:30:25.928 "data_offset": 256,
00:30:25.928 "data_size": 7936
00:30:25.928 }
00:30:25.928 ]
00:30:25.928 }'
00:30:25.928 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:30:25.928 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:30:25.928 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:30:25.928 [2024-07-24 20:06:17.503462] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:30:25.928 [2024-07-24 20:06:17.503518] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:30:25.928 [2024-07-24 20:06:17.503604] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:30:25.928 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:30:25.928 20:06:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@726 -- # sleep 1
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout ))
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:30:27.305 "name": "raid_bdev1",
00:30:27.305 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:27.305 "strip_size_kb": 0,
00:30:27.305 "state": "online",
00:30:27.305 "raid_level": "raid1",
00:30:27.305 "superblock": true,
00:30:27.305 "num_base_bdevs": 2,
00:30:27.305 "num_base_bdevs_discovered": 2,
00:30:27.305 "num_base_bdevs_operational": 2,
00:30:27.305 "base_bdevs_list": [
00:30:27.305 {
00:30:27.305 "name": "spare",
00:30:27.305 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:27.305 "is_configured": true,
00:30:27.305 "data_offset": 256,
00:30:27.305 "data_size": 7936
00:30:27.305 },
00:30:27.305 {
00:30:27.305 "name": "BaseBdev2",
00:30:27.305 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:27.305 "is_configured": true,
00:30:27.305 "data_offset": 256,
00:30:27.305 "data_size": 7936
00:30:27.305 }
00:30:27.305 ]
00:30:27.305 }'
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:30:27.305 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@724 -- # break
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:27.564 20:06:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:27.564 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:30:27.564 "name": "raid_bdev1",
00:30:27.564 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:27.564 "strip_size_kb": 0,
00:30:27.564 "state": "online",
00:30:27.564 "raid_level": "raid1",
00:30:27.564 "superblock": true,
00:30:27.564 "num_base_bdevs": 2,
00:30:27.564 "num_base_bdevs_discovered": 2,
00:30:27.564 "num_base_bdevs_operational": 2,
00:30:27.564 "base_bdevs_list": [
00:30:27.564 {
00:30:27.564 "name": "spare",
00:30:27.564 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:27.564 "is_configured": true,
00:30:27.564 "data_offset": 256,
00:30:27.564 "data_size": 7936
00:30:27.564 },
00:30:27.564 {
00:30:27.564 "name": "BaseBdev2",
00:30:27.564 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:27.564 "is_configured": true,
00:30:27.564 "data_offset": 256,
00:30:27.564 "data_size": 7936
00:30:27.564 }
00:30:27.564 ]
00:30:27.564 }'
00:30:27.564 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:30:27.822 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:30:27.823 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:30:27.823 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:27.823 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:28.082 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:30:28.082 "name": "raid_bdev1",
00:30:28.082 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:28.082 "strip_size_kb": 0,
00:30:28.082 "state": "online",
00:30:28.082 "raid_level": "raid1",
00:30:28.082 "superblock": true,
00:30:28.082 "num_base_bdevs": 2,
00:30:28.082 "num_base_bdevs_discovered": 2,
00:30:28.082 "num_base_bdevs_operational": 2,
00:30:28.082 "base_bdevs_list": [
00:30:28.082 {
00:30:28.082 "name": "spare",
00:30:28.082 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:28.082 "is_configured": true,
00:30:28.082 "data_offset": 256,
00:30:28.082 "data_size": 7936
00:30:28.082 },
00:30:28.082 {
00:30:28.082 "name": "BaseBdev2",
00:30:28.082 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4",
00:30:28.082 "is_configured": true,
00:30:28.082 "data_offset": 256,
00:30:28.082 "data_size": 7936
00:30:28.082 }
00:30:28.082 ]
00:30:28.082 }'
00:30:28.082 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:30:28.082 20:06:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:30:28.650 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:30:28.909 [2024-07-24 20:06:20.383367] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:30:28.909 [2024-07-24 20:06:20.383397] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:30:28.909 [2024-07-24 20:06:20.383454] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:30:28.909 [2024-07-24 20:06:20.383512] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:30:28.909 [2024-07-24 20:06:20.383524] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa72bd0 name raid_bdev1, state offline
00:30:28.909 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:28.909 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # jq length
00:30:29.168 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]]
00:30:29.168 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@737 -- # '[' false = true ']'
00:30:29.168 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@758 -- # '[' true = true ']'
00:30:29.168 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare
00:30:29.427 20:06:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:30:29.686 [2024-07-24 20:06:21.157552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:30:29.686 [2024-07-24 20:06:21.157598] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:30:29.686 [2024-07-24 20:06:21.157619] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8cff60
00:30:29.686 [2024-07-24 20:06:21.157632] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:30:29.686 [2024-07-24 20:06:21.159133] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:30:29.686 [2024-07-24 20:06:21.159161] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:30:29.686 [2024-07-24 20:06:21.159219] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare
00:30:29.686 [2024-07-24 20:06:21.159246] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:30:29.686 [2024-07-24 20:06:21.159334] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:30:29.686 spare
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:30:29.686 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:30:29.686 [2024-07-24 20:06:21.259650] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x8d1ea0
00:30:29.686 [2024-07-24 20:06:21.259668] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128
00:30:29.687 [2024-07-24 20:06:21.259745] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa6a100
00:30:29.687 [2024-07-24 20:06:21.259844] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8d1ea0
00:30:29.687 [2024-07-24 20:06:21.259854] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x8d1ea0
00:30:29.687 [2024-07-24 20:06:21.259924] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:30:29.946 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:30:29.946 "name": "raid_bdev1",
00:30:29.946 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92",
00:30:29.946 "strip_size_kb": 0,
00:30:29.946 "state": "online",
00:30:29.946 "raid_level": "raid1",
00:30:29.946 "superblock": true,
00:30:29.946 "num_base_bdevs": 2,
00:30:29.946 "num_base_bdevs_discovered": 2,
00:30:29.946 "num_base_bdevs_operational": 2,
00:30:29.946 "base_bdevs_list": [
00:30:29.946 {
00:30:29.946 "name": "spare",
00:30:29.946 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9",
00:30:29.946 "is_configured": true,
00:30:29.946 "data_offset": 256,
00:30:29.946 "data_size": 7936
00:30:29.946 },
00:30:29.946 {
00:30:29.946 "name": "BaseBdev2",
00:30:29.946 "uuid":
"a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:29.946 "is_configured": true, 00:30:29.946 "data_offset": 256, 00:30:29.946 "data_size": 7936 00:30:29.946 } 00:30:29.946 ] 00:30:29.946 }' 00:30:29.946 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:29.946 20:06:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:30.883 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:30.883 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:30.883 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:30.883 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:30.883 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:30.883 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:30.883 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:31.141 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:31.141 "name": "raid_bdev1", 00:30:31.141 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:31.141 "strip_size_kb": 0, 00:30:31.141 "state": "online", 00:30:31.141 "raid_level": "raid1", 00:30:31.141 "superblock": true, 00:30:31.141 "num_base_bdevs": 2, 00:30:31.141 "num_base_bdevs_discovered": 2, 00:30:31.141 "num_base_bdevs_operational": 2, 00:30:31.141 "base_bdevs_list": [ 00:30:31.141 { 00:30:31.141 "name": "spare", 00:30:31.141 "uuid": 
"12ee0c91-caf4-5755-bfe6-d238003be2c9", 00:30:31.141 "is_configured": true, 00:30:31.141 "data_offset": 256, 00:30:31.141 "data_size": 7936 00:30:31.141 }, 00:30:31.141 { 00:30:31.141 "name": "BaseBdev2", 00:30:31.141 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:31.141 "is_configured": true, 00:30:31.141 "data_offset": 256, 00:30:31.141 "data_size": 7936 00:30:31.141 } 00:30:31.141 ] 00:30:31.141 }' 00:30:31.141 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:31.141 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:31.141 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:31.400 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:31.400 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:31.400 20:06:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:30:31.658 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:30:31.658 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:31.917 [2024-07-24 20:06:23.303386] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:31.917 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:32.177 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:32.177 "name": "raid_bdev1", 00:30:32.177 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:32.177 "strip_size_kb": 0, 00:30:32.177 "state": "online", 00:30:32.177 "raid_level": "raid1", 00:30:32.177 "superblock": true, 00:30:32.177 "num_base_bdevs": 2, 00:30:32.177 "num_base_bdevs_discovered": 1, 00:30:32.177 "num_base_bdevs_operational": 1, 00:30:32.177 "base_bdevs_list": [ 00:30:32.177 { 00:30:32.177 "name": null, 00:30:32.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:32.177 "is_configured": false, 00:30:32.177 "data_offset": 
256, 00:30:32.177 "data_size": 7936 00:30:32.177 }, 00:30:32.177 { 00:30:32.177 "name": "BaseBdev2", 00:30:32.177 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:32.177 "is_configured": true, 00:30:32.177 "data_offset": 256, 00:30:32.177 "data_size": 7936 00:30:32.177 } 00:30:32.177 ] 00:30:32.177 }' 00:30:32.177 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:32.177 20:06:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:33.113 20:06:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:33.371 [2024-07-24 20:06:24.779313] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:33.371 [2024-07-24 20:06:24.779474] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:33.371 [2024-07-24 20:06:24.779497] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:30:33.371 [2024-07-24 20:06:24.779525] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:33.371 [2024-07-24 20:06:24.783015] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8cf7d0 00:30:33.371 [2024-07-24 20:06:24.784407] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:33.371 20:06:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # sleep 1 00:30:34.308 20:06:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:34.308 20:06:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:34.308 20:06:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:34.308 20:06:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:34.308 20:06:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:34.308 20:06:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:34.308 20:06:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:34.567 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:34.567 "name": "raid_bdev1", 00:30:34.567 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:34.567 "strip_size_kb": 0, 00:30:34.567 "state": "online", 00:30:34.567 "raid_level": "raid1", 00:30:34.567 "superblock": true, 00:30:34.567 "num_base_bdevs": 2, 00:30:34.567 "num_base_bdevs_discovered": 2, 00:30:34.567 "num_base_bdevs_operational": 2, 00:30:34.567 "process": { 00:30:34.567 "type": 
"rebuild", 00:30:34.567 "target": "spare", 00:30:34.567 "progress": { 00:30:34.567 "blocks": 3072, 00:30:34.567 "percent": 38 00:30:34.567 } 00:30:34.567 }, 00:30:34.567 "base_bdevs_list": [ 00:30:34.567 { 00:30:34.567 "name": "spare", 00:30:34.567 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9", 00:30:34.567 "is_configured": true, 00:30:34.567 "data_offset": 256, 00:30:34.567 "data_size": 7936 00:30:34.567 }, 00:30:34.567 { 00:30:34.567 "name": "BaseBdev2", 00:30:34.567 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:34.567 "is_configured": true, 00:30:34.567 "data_offset": 256, 00:30:34.567 "data_size": 7936 00:30:34.567 } 00:30:34.567 ] 00:30:34.567 }' 00:30:34.567 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:34.567 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:34.567 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:34.826 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:34.826 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:34.826 [2024-07-24 20:06:26.405825] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:35.085 [2024-07-24 20:06:26.497599] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:35.085 [2024-07-24 20:06:26.497651] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:35.085 [2024-07-24 20:06:26.497667] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:35.085 [2024-07-24 20:06:26.497675] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:35.085 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:35.344 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:35.344 "name": "raid_bdev1", 00:30:35.344 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:35.344 "strip_size_kb": 0, 00:30:35.344 "state": "online", 00:30:35.344 "raid_level": "raid1", 00:30:35.344 "superblock": true, 00:30:35.344 
"num_base_bdevs": 2, 00:30:35.344 "num_base_bdevs_discovered": 1, 00:30:35.344 "num_base_bdevs_operational": 1, 00:30:35.344 "base_bdevs_list": [ 00:30:35.344 { 00:30:35.344 "name": null, 00:30:35.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:35.344 "is_configured": false, 00:30:35.344 "data_offset": 256, 00:30:35.344 "data_size": 7936 00:30:35.344 }, 00:30:35.344 { 00:30:35.344 "name": "BaseBdev2", 00:30:35.344 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:35.344 "is_configured": true, 00:30:35.344 "data_offset": 256, 00:30:35.344 "data_size": 7936 00:30:35.344 } 00:30:35.344 ] 00:30:35.344 }' 00:30:35.344 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:35.344 20:06:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:36.281 20:06:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:36.540 [2024-07-24 20:06:27.913690] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:36.540 [2024-07-24 20:06:27.913739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:36.540 [2024-07-24 20:06:27.913760] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d2960 00:30:36.540 [2024-07-24 20:06:27.913774] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:36.540 [2024-07-24 20:06:27.913966] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:36.540 [2024-07-24 20:06:27.913982] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:36.540 [2024-07-24 20:06:27.914039] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:36.540 [2024-07-24 20:06:27.914052] 
bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:36.540 [2024-07-24 20:06:27.914063] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:30:36.540 [2024-07-24 20:06:27.914081] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:36.540 [2024-07-24 20:06:27.917562] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8d2bf0 00:30:36.540 [2024-07-24 20:06:27.918920] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:36.540 spare 00:30:36.540 20:06:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # sleep 1 00:30:37.477 20:06:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:37.477 20:06:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:37.477 20:06:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:37.477 20:06:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:37.477 20:06:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:37.477 20:06:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:37.477 20:06:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:37.737 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:37.737 "name": "raid_bdev1", 00:30:37.737 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:37.737 
"strip_size_kb": 0, 00:30:37.737 "state": "online", 00:30:37.737 "raid_level": "raid1", 00:30:37.737 "superblock": true, 00:30:37.737 "num_base_bdevs": 2, 00:30:37.737 "num_base_bdevs_discovered": 2, 00:30:37.737 "num_base_bdevs_operational": 2, 00:30:37.737 "process": { 00:30:37.737 "type": "rebuild", 00:30:37.737 "target": "spare", 00:30:37.737 "progress": { 00:30:37.737 "blocks": 3072, 00:30:37.737 "percent": 38 00:30:37.737 } 00:30:37.737 }, 00:30:37.737 "base_bdevs_list": [ 00:30:37.737 { 00:30:37.737 "name": "spare", 00:30:37.737 "uuid": "12ee0c91-caf4-5755-bfe6-d238003be2c9", 00:30:37.737 "is_configured": true, 00:30:37.737 "data_offset": 256, 00:30:37.737 "data_size": 7936 00:30:37.737 }, 00:30:37.737 { 00:30:37.737 "name": "BaseBdev2", 00:30:37.737 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:37.737 "is_configured": true, 00:30:37.737 "data_offset": 256, 00:30:37.737 "data_size": 7936 00:30:37.737 } 00:30:37.737 ] 00:30:37.737 }' 00:30:37.737 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:37.737 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:37.737 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:38.029 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:38.029 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:38.029 [2024-07-24 20:06:29.560288] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:38.288 [2024-07-24 20:06:29.632409] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:38.288 [2024-07-24 20:06:29.632466] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:38.288 [2024-07-24 20:06:29.632483] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:38.288 [2024-07-24 20:06:29.632492] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:38.288 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:38.547 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:38.547 "name": "raid_bdev1", 00:30:38.547 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:38.547 "strip_size_kb": 0, 00:30:38.547 "state": "online", 00:30:38.547 "raid_level": "raid1", 00:30:38.547 "superblock": true, 00:30:38.547 "num_base_bdevs": 2, 00:30:38.547 "num_base_bdevs_discovered": 1, 00:30:38.547 "num_base_bdevs_operational": 1, 00:30:38.547 "base_bdevs_list": [ 00:30:38.547 { 00:30:38.547 "name": null, 00:30:38.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:38.547 "is_configured": false, 00:30:38.547 "data_offset": 256, 00:30:38.547 "data_size": 7936 00:30:38.547 }, 00:30:38.547 { 00:30:38.547 "name": "BaseBdev2", 00:30:38.547 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:38.547 "is_configured": true, 00:30:38.547 "data_offset": 256, 00:30:38.547 "data_size": 7936 00:30:38.547 } 00:30:38.547 ] 00:30:38.547 }' 00:30:38.547 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:38.547 20:06:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:39.482 20:06:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:39.482 20:06:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:39.482 20:06:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:39.482 20:06:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:39.482 20:06:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:39.482 20:06:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.482 
20:06:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:39.482 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:39.482 "name": "raid_bdev1", 00:30:39.482 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:39.482 "strip_size_kb": 0, 00:30:39.482 "state": "online", 00:30:39.482 "raid_level": "raid1", 00:30:39.482 "superblock": true, 00:30:39.482 "num_base_bdevs": 2, 00:30:39.482 "num_base_bdevs_discovered": 1, 00:30:39.482 "num_base_bdevs_operational": 1, 00:30:39.482 "base_bdevs_list": [ 00:30:39.482 { 00:30:39.482 "name": null, 00:30:39.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:39.482 "is_configured": false, 00:30:39.482 "data_offset": 256, 00:30:39.482 "data_size": 7936 00:30:39.482 }, 00:30:39.482 { 00:30:39.482 "name": "BaseBdev2", 00:30:39.482 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:39.482 "is_configured": true, 00:30:39.482 "data_offset": 256, 00:30:39.482 "data_size": 7936 00:30:39.482 } 00:30:39.482 ] 00:30:39.482 }' 00:30:39.482 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:39.740 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:39.740 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:39.740 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:39.740 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:30:39.998 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:40.256 [2024-07-24 20:06:31.605822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:40.256 [2024-07-24 20:06:31.605865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:40.256 [2024-07-24 20:06:31.605887] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d1940 00:30:40.256 [2024-07-24 20:06:31.605900] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:40.256 [2024-07-24 20:06:31.606066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:40.256 [2024-07-24 20:06:31.606082] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:40.256 [2024-07-24 20:06:31.606134] bdev_raid.c:3849:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:30:40.256 [2024-07-24 20:06:31.606146] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:40.256 [2024-07-24 20:06:31.606157] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:40.256 BaseBdev1 00:30:40.256 20:06:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@789 -- # sleep 1 00:30:41.189 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:41.189 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:41.189 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:41.190 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:41.447 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:41.447 "name": "raid_bdev1", 00:30:41.447 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:41.447 "strip_size_kb": 0, 00:30:41.447 "state": "online", 00:30:41.447 "raid_level": "raid1", 00:30:41.447 "superblock": true, 00:30:41.447 "num_base_bdevs": 2, 00:30:41.448 "num_base_bdevs_discovered": 1, 00:30:41.448 "num_base_bdevs_operational": 1, 00:30:41.448 "base_bdevs_list": [ 00:30:41.448 { 00:30:41.448 "name": null, 00:30:41.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:41.448 "is_configured": false, 00:30:41.448 "data_offset": 256, 00:30:41.448 "data_size": 7936 00:30:41.448 }, 00:30:41.448 { 00:30:41.448 "name": "BaseBdev2", 00:30:41.448 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:41.448 "is_configured": true, 00:30:41.448 "data_offset": 256, 00:30:41.448 "data_size": 7936 00:30:41.448 } 00:30:41.448 ] 00:30:41.448 }' 
00:30:41.448 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:41.448 20:06:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:42.380 20:06:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:42.380 20:06:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:42.380 20:06:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:42.380 20:06:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:42.380 20:06:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:42.380 20:06:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:42.380 20:06:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:42.638 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:42.638 "name": "raid_bdev1", 00:30:42.638 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:42.638 "strip_size_kb": 0, 00:30:42.638 "state": "online", 00:30:42.638 "raid_level": "raid1", 00:30:42.638 "superblock": true, 00:30:42.638 "num_base_bdevs": 2, 00:30:42.638 "num_base_bdevs_discovered": 1, 00:30:42.638 "num_base_bdevs_operational": 1, 00:30:42.638 "base_bdevs_list": [ 00:30:42.638 { 00:30:42.638 "name": null, 00:30:42.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:42.638 "is_configured": false, 00:30:42.638 "data_offset": 256, 00:30:42.638 "data_size": 7936 00:30:42.638 }, 00:30:42.638 { 00:30:42.638 "name": "BaseBdev2", 00:30:42.638 
"uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:42.638 "is_configured": true, 00:30:42.638 "data_offset": 256, 00:30:42.638 "data_size": 7936 00:30:42.638 } 00:30:42.638 ] 00:30:42.638 }' 00:30:42.638 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:42.896 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:42.897 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:30:42.897 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:42.897 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:42.897 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:43.155 [2024-07-24 20:06:34.513672] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:43.155 [2024-07-24 20:06:34.513794] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:43.155 [2024-07-24 20:06:34.513811] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:43.155 request: 00:30:43.155 { 00:30:43.155 "base_bdev": "BaseBdev1", 00:30:43.155 "raid_bdev": "raid_bdev1", 00:30:43.155 "method": "bdev_raid_add_base_bdev", 00:30:43.155 "req_id": 1 00:30:43.155 } 00:30:43.155 Got JSON-RPC error response 00:30:43.155 response: 00:30:43.155 { 00:30:43.155 "code": -22, 00:30:43.155 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:30:43.155 } 00:30:43.155 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:30:43.155 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:30:43.155 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@672 -- # [[ -n '' ]] 00:30:43.155 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:30:43.155 20:06:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@793 -- # sleep 1 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:44.090 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:44.348 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:30:44.348 "name": "raid_bdev1", 00:30:44.348 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:44.348 "strip_size_kb": 0, 00:30:44.348 "state": "online", 00:30:44.348 "raid_level": "raid1", 00:30:44.348 "superblock": true, 00:30:44.348 "num_base_bdevs": 2, 00:30:44.348 "num_base_bdevs_discovered": 1, 00:30:44.348 "num_base_bdevs_operational": 1, 00:30:44.348 "base_bdevs_list": [ 00:30:44.348 { 00:30:44.348 "name": null, 00:30:44.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:44.348 "is_configured": false, 00:30:44.348 "data_offset": 256, 00:30:44.348 "data_size": 7936 00:30:44.348 }, 00:30:44.348 { 00:30:44.348 "name": "BaseBdev2", 00:30:44.348 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:44.348 "is_configured": true, 00:30:44.348 "data_offset": 256, 00:30:44.348 "data_size": 7936 00:30:44.348 } 00:30:44.348 ] 00:30:44.348 }' 00:30:44.348 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:44.348 20:06:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:44.914 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:44.914 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:44.914 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:44.914 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:44.914 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:44.914 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:44.914 20:06:36 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:45.480 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:45.480 "name": "raid_bdev1", 00:30:45.480 "uuid": "73e285ff-c07d-43ed-9f90-c39171902b92", 00:30:45.480 "strip_size_kb": 0, 00:30:45.481 "state": "online", 00:30:45.481 "raid_level": "raid1", 00:30:45.481 "superblock": true, 00:30:45.481 "num_base_bdevs": 2, 00:30:45.481 "num_base_bdevs_discovered": 1, 00:30:45.481 "num_base_bdevs_operational": 1, 00:30:45.481 "base_bdevs_list": [ 00:30:45.481 { 00:30:45.481 "name": null, 00:30:45.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:45.481 "is_configured": false, 00:30:45.481 "data_offset": 256, 00:30:45.481 "data_size": 7936 00:30:45.481 }, 00:30:45.481 { 00:30:45.481 "name": "BaseBdev2", 00:30:45.481 "uuid": "a93186ca-e7fb-5cef-8e35-416e62ab92a4", 00:30:45.481 "is_configured": true, 00:30:45.481 "data_offset": 256, 00:30:45.481 "data_size": 7936 00:30:45.481 } 00:30:45.481 ] 00:30:45.481 }' 00:30:45.481 20:06:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:45.481 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:45.481 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@798 -- # killprocess 1540322 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1540322 ']' 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1540322 00:30:45.738 20:06:37 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1540322 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1540322' 00:30:45.738 killing process with pid 1540322 00:30:45.738 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1540322 00:30:45.738 Received shutdown signal, test time was about 60.000000 seconds 00:30:45.738 00:30:45.738 Latency(us) 00:30:45.738 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:45.738 =================================================================================================================== 00:30:45.738 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:30:45.739 [2024-07-24 20:06:37.120677] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:45.739 [2024-07-24 20:06:37.120761] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:45.739 [2024-07-24 20:06:37.120806] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:45.739 [2024-07-24 20:06:37.120818] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8d1ea0 name raid_bdev1, state offline 00:30:45.739 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1540322 00:30:45.739 [2024-07-24 
20:06:37.147895] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:45.997 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@800 -- # return 0 00:30:45.997 00:30:45.997 real 0m31.663s 00:30:45.997 user 0m51.396s 00:30:45.997 sys 0m4.127s 00:30:45.997 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:45.997 20:06:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:45.997 ************************************ 00:30:45.997 END TEST raid_rebuild_test_sb_md_interleaved 00:30:45.997 ************************************ 00:30:45.997 20:06:37 bdev_raid -- bdev/bdev_raid.sh@996 -- # trap - EXIT 00:30:45.997 20:06:37 bdev_raid -- bdev/bdev_raid.sh@997 -- # cleanup 00:30:45.997 20:06:37 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1540322 ']' 00:30:45.997 20:06:37 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1540322 00:30:45.997 20:06:37 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:30:45.997 00:30:45.997 real 19m47.732s 00:30:45.997 user 33m41.123s 00:30:45.997 sys 3m34.561s 00:30:45.997 20:06:37 bdev_raid -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:45.997 20:06:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:45.997 ************************************ 00:30:45.997 END TEST bdev_raid 00:30:45.997 ************************************ 00:30:45.997 20:06:37 -- spdk/autotest.sh@195 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:30:45.997 20:06:37 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:30:45.997 20:06:37 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:45.997 20:06:37 -- common/autotest_common.sh@10 -- # set +x 00:30:45.997 ************************************ 00:30:45.997 START TEST bdevperf_config 00:30:45.997 ************************************ 00:30:45.997 20:06:37 bdevperf_config -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:30:46.256 * Looking for test storage... 00:30:46.256 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:46.256 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:46.256 20:06:37 bdevperf_config 
-- bdevperf/common.sh@9 -- # local rw= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:46.256 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:46.256 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:46.256 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:46.256 
20:06:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:46.256 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:46.256 20:06:37 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:49.542 20:06:40 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-24 20:06:37.737187] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:49.542 [2024-07-24 20:06:37.737259] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1544888 ] 00:30:49.542 Using job config with 4 jobs 00:30:49.542 [2024-07-24 20:06:37.884053] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:49.542 [2024-07-24 20:06:38.005652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:49.542 cpumask for '\''job0'\'' is too big 00:30:49.542 cpumask for '\''job1'\'' is too big 00:30:49.542 cpumask for '\''job2'\'' is too big 00:30:49.542 cpumask for '\''job3'\'' is too big 00:30:49.542 Running I/O for 2 seconds... 
00:30:49.542 00:30:49.542 Latency(us) 00:30:49.542 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.01 23916.11 23.36 0.00 0.00 10690.93 1880.60 16412.49 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.02 23925.01 23.36 0.00 0.00 10663.17 1852.10 14531.90 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.02 23903.06 23.34 0.00 0.00 10650.13 1852.10 12651.30 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.03 23881.15 23.32 0.00 0.00 10635.69 1852.10 10941.66 00:30:49.542 =================================================================================================================== 00:30:49.542 Total : 95625.33 93.38 0.00 0.00 10659.94 1852.10 16412.49' 00:30:49.542 20:06:40 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-24 20:06:37.737187] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:49.542 [2024-07-24 20:06:37.737259] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1544888 ] 00:30:49.542 Using job config with 4 jobs 00:30:49.542 [2024-07-24 20:06:37.884053] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:49.542 [2024-07-24 20:06:38.005652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:49.542 cpumask for '\''job0'\'' is too big 00:30:49.542 cpumask for '\''job1'\'' is too big 00:30:49.542 cpumask for '\''job2'\'' is too big 00:30:49.542 cpumask for '\''job3'\'' is too big 00:30:49.542 Running I/O for 2 seconds... 
00:30:49.542 00:30:49.542 Latency(us) 00:30:49.542 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.01 23916.11 23.36 0.00 0.00 10690.93 1880.60 16412.49 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.02 23925.01 23.36 0.00 0.00 10663.17 1852.10 14531.90 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.02 23903.06 23.34 0.00 0.00 10650.13 1852.10 12651.30 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.03 23881.15 23.32 0.00 0.00 10635.69 1852.10 10941.66 00:30:49.542 =================================================================================================================== 00:30:49.542 Total : 95625.33 93.38 0.00 0.00 10659.94 1852.10 16412.49' 00:30:49.542 20:06:40 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 20:06:37.737187] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:49.542 [2024-07-24 20:06:37.737259] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1544888 ] 00:30:49.542 Using job config with 4 jobs 00:30:49.542 [2024-07-24 20:06:37.884053] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:49.542 [2024-07-24 20:06:38.005652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:49.542 cpumask for '\''job0'\'' is too big 00:30:49.542 cpumask for '\''job1'\'' is too big 00:30:49.542 cpumask for '\''job2'\'' is too big 00:30:49.542 cpumask for '\''job3'\'' is too big 00:30:49.542 Running I/O for 2 seconds... 
00:30:49.542 00:30:49.542 Latency(us) 00:30:49.542 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.01 23916.11 23.36 0.00 0.00 10690.93 1880.60 16412.49 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.02 23925.01 23.36 0.00 0.00 10663.17 1852.10 14531.90 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.02 23903.06 23.34 0.00 0.00 10650.13 1852.10 12651.30 00:30:49.542 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:49.542 Malloc0 : 2.03 23881.15 23.32 0.00 0.00 10635.69 1852.10 10941.66 00:30:49.542 =================================================================================================================== 00:30:49.542 Total : 95625.33 93.38 0.00 0.00 10659.94 1852.10 16412.49' 00:30:49.542 20:06:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:49.542 20:06:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:49.542 20:06:40 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:30:49.542 20:06:40 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:49.542 [2024-07-24 20:06:40.520930] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:30:49.542 [2024-07-24 20:06:40.520999] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545239 ] 00:30:49.542 [2024-07-24 20:06:40.666901] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:49.542 [2024-07-24 20:06:40.783881] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:49.542 cpumask for 'job0' is too big 00:30:49.542 cpumask for 'job1' is too big 00:30:49.542 cpumask for 'job2' is too big 00:30:49.542 cpumask for 'job3' is too big 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:30:52.073 Running I/O for 2 seconds... 00:30:52.073 00:30:52.073 Latency(us) 00:30:52.073 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:52.073 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:52.073 Malloc0 : 2.01 23891.10 23.33 0.00 0.00 10698.49 1852.10 16412.49 00:30:52.073 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:52.073 Malloc0 : 2.02 23868.98 23.31 0.00 0.00 10684.65 1837.86 14531.90 00:30:52.073 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:52.073 Malloc0 : 2.02 23910.01 23.35 0.00 0.00 10641.68 1852.10 12708.29 00:30:52.073 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:52.073 Malloc0 : 2.03 23888.06 23.33 0.00 0.00 10628.13 1837.86 11055.64 00:30:52.073 =================================================================================================================== 00:30:52.073 Total : 95558.14 93.32 0.00 0.00 10663.16 1837.86 16412.49' 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:52.073 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:52.073 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:52.073 20:06:43 bdevperf_config -- 
bdevperf/common.sh@18 -- # job='[job2]' 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:52.073 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:52.073 20:06:43 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-24 20:06:43.321960] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:54.604 [2024-07-24 20:06:43.322102] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545589 ] 00:30:54.604 Using job config with 3 jobs 00:30:54.604 [2024-07-24 20:06:43.540630] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:54.604 [2024-07-24 20:06:43.664011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:54.604 cpumask for '\''job0'\'' is too big 00:30:54.604 cpumask for '\''job1'\'' is too big 00:30:54.604 cpumask for '\''job2'\'' is too big 00:30:54.604 Running I/O for 2 seconds... 
00:30:54.604 00:30:54.604 Latency(us) 00:30:54.604 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.01 32459.80 31.70 0.00 0.00 7887.30 1823.61 11625.52 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.01 32429.73 31.67 0.00 0.00 7877.06 1816.49 9801.91 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.02 32483.65 31.72 0.00 0.00 7846.36 940.30 8092.27 00:30:54.604 =================================================================================================================== 00:30:54.604 Total : 97373.18 95.09 0.00 0.00 7870.21 940.30 11625.52' 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-24 20:06:43.321960] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:54.604 [2024-07-24 20:06:43.322102] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545589 ] 00:30:54.604 Using job config with 3 jobs 00:30:54.604 [2024-07-24 20:06:43.540630] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:54.604 [2024-07-24 20:06:43.664011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:54.604 cpumask for '\''job0'\'' is too big 00:30:54.604 cpumask for '\''job1'\'' is too big 00:30:54.604 cpumask for '\''job2'\'' is too big 00:30:54.604 Running I/O for 2 seconds... 
00:30:54.604 00:30:54.604 Latency(us) 00:30:54.604 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.01 32459.80 31.70 0.00 0.00 7887.30 1823.61 11625.52 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.01 32429.73 31.67 0.00 0.00 7877.06 1816.49 9801.91 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.02 32483.65 31.72 0.00 0.00 7846.36 940.30 8092.27 00:30:54.604 =================================================================================================================== 00:30:54.604 Total : 97373.18 95.09 0.00 0.00 7870.21 940.30 11625.52' 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 20:06:43.321960] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:54.604 [2024-07-24 20:06:43.322102] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545589 ] 00:30:54.604 Using job config with 3 jobs 00:30:54.604 [2024-07-24 20:06:43.540630] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:54.604 [2024-07-24 20:06:43.664011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:54.604 cpumask for '\''job0'\'' is too big 00:30:54.604 cpumask for '\''job1'\'' is too big 00:30:54.604 cpumask for '\''job2'\'' is too big 00:30:54.604 Running I/O for 2 seconds... 
00:30:54.604 00:30:54.604 Latency(us) 00:30:54.604 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.01 32459.80 31.70 0.00 0.00 7887.30 1823.61 11625.52 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.01 32429.73 31.67 0.00 0.00 7877.06 1816.49 9801.91 00:30:54.604 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:54.604 Malloc0 : 2.02 32483.65 31.72 0.00 0.00 7846.36 940.30 8092.27 00:30:54.604 =================================================================================================================== 00:30:54.604 Total : 97373.18 95.09 0.00 0.00 7870.21 940.30 11625.52' 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:54.604 20:06:46 
bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:54.604 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:54.604 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:54.604 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:54.604 20:06:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 
00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:54.605 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:54.605 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:54.605 20:06:46 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:57.892 20:06:48 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-24 20:06:46.208686] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:30:57.892 [2024-07-24 20:06:46.208757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1546018 ] 00:30:57.892 Using job config with 4 jobs 00:30:57.892 [2024-07-24 20:06:46.356038] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:57.892 [2024-07-24 20:06:46.474526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:57.892 cpumask for '\''job0'\'' is too big 00:30:57.892 cpumask for '\''job1'\'' is too big 00:30:57.892 cpumask for '\''job2'\'' is too big 00:30:57.892 cpumask for '\''job3'\'' is too big 00:30:57.892 Running I/O for 2 seconds... 00:30:57.892 00:30:57.892 Latency(us) 00:30:57.892 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:57.892 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.892 Malloc0 : 2.04 11942.64 11.66 0.00 0.00 21425.41 3818.18 33280.89 00:30:57.892 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.892 Malloc1 : 2.04 11931.38 11.65 0.00 0.00 21423.07 4644.51 33280.89 00:30:57.892 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.892 Malloc0 : 2.04 11920.48 11.64 0.00 0.00 21366.89 3960.65 29177.77 00:30:57.892 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.892 Malloc1 : 2.04 11909.36 11.63 0.00 0.00 21368.10 4786.98 29177.77 00:30:57.892 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.04 11898.48 11.62 0.00 0.00 21310.05 3789.69 25416.57 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.05 11887.39 11.61 0.00 0.00 21309.48 4616.01 25302.59 00:30:57.893 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.05 11876.59 11.60 0.00 0.00 21250.41 3789.69 21769.35 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.05 11865.51 11.59 0.00 0.00 21249.66 4616.01 21769.35 00:30:57.893 =================================================================================================================== 00:30:57.893 Total : 95231.83 93.00 0.00 0.00 21337.88 3789.69 33280.89' 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-24 20:06:46.208686] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:30:57.893 [2024-07-24 20:06:46.208757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1546018 ] 00:30:57.893 Using job config with 4 jobs 00:30:57.893 [2024-07-24 20:06:46.356038] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:57.893 [2024-07-24 20:06:46.474526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:57.893 cpumask for '\''job0'\'' is too big 00:30:57.893 cpumask for '\''job1'\'' is too big 00:30:57.893 cpumask for '\''job2'\'' is too big 00:30:57.893 cpumask for '\''job3'\'' is too big 00:30:57.893 Running I/O for 2 seconds... 
00:30:57.893 00:30:57.893 Latency(us) 00:30:57.893 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:57.893 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.04 11942.64 11.66 0.00 0.00 21425.41 3818.18 33280.89 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.04 11931.38 11.65 0.00 0.00 21423.07 4644.51 33280.89 00:30:57.893 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.04 11920.48 11.64 0.00 0.00 21366.89 3960.65 29177.77 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.04 11909.36 11.63 0.00 0.00 21368.10 4786.98 29177.77 00:30:57.893 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.04 11898.48 11.62 0.00 0.00 21310.05 3789.69 25416.57 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.05 11887.39 11.61 0.00 0.00 21309.48 4616.01 25302.59 00:30:57.893 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.05 11876.59 11.60 0.00 0.00 21250.41 3789.69 21769.35 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.05 11865.51 11.59 0.00 0.00 21249.66 4616.01 21769.35 00:30:57.893 =================================================================================================================== 00:30:57.893 Total : 95231.83 93.00 0.00 0.00 21337.88 3789.69 33280.89' 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 20:06:46.208686] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:30:57.893 [2024-07-24 20:06:46.208757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1546018 ] 00:30:57.893 Using job config with 4 jobs 00:30:57.893 [2024-07-24 20:06:46.356038] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:57.893 [2024-07-24 20:06:46.474526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:57.893 cpumask for '\''job0'\'' is too big 00:30:57.893 cpumask for '\''job1'\'' is too big 00:30:57.893 cpumask for '\''job2'\'' is too big 00:30:57.893 cpumask for '\''job3'\'' is too big 00:30:57.893 Running I/O for 2 seconds... 00:30:57.893 00:30:57.893 Latency(us) 00:30:57.893 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:57.893 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.04 11942.64 11.66 0.00 0.00 21425.41 3818.18 33280.89 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.04 11931.38 11.65 0.00 0.00 21423.07 4644.51 33280.89 00:30:57.893 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.04 11920.48 11.64 0.00 0.00 21366.89 3960.65 29177.77 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.04 11909.36 11.63 0.00 0.00 21368.10 4786.98 29177.77 00:30:57.893 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.04 11898.48 11.62 0.00 0.00 21310.05 3789.69 25416.57 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.05 11887.39 11.61 0.00 0.00 21309.48 4616.01 25302.59 00:30:57.893 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc0 : 2.05 11876.59 11.60 0.00 0.00 21250.41 3789.69 21769.35 00:30:57.893 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:57.893 Malloc1 : 2.05 11865.51 11.59 0.00 0.00 21249.66 4616.01 21769.35 00:30:57.893 =================================================================================================================== 00:30:57.893 Total : 95231.83 93.00 0.00 0.00 21337.88 3789.69 33280.89' 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:57.893 20:06:48 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:30:57.893 00:30:57.893 real 0m11.431s 00:30:57.893 user 0m10.020s 00:30:57.893 sys 0m1.259s 00:30:57.893 20:06:48 bdevperf_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:57.893 20:06:48 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:30:57.893 ************************************ 00:30:57.893 END TEST bdevperf_config 00:30:57.893 ************************************ 00:30:57.893 20:06:49 -- spdk/autotest.sh@196 -- # uname -s 00:30:57.893 20:06:49 -- spdk/autotest.sh@196 -- # [[ Linux == Linux ]] 00:30:57.893 20:06:49 -- spdk/autotest.sh@197 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:57.893 20:06:49 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:30:57.893 20:06:49 -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:30:57.893 20:06:49 -- common/autotest_common.sh@10 -- # set +x 00:30:57.893 ************************************ 00:30:57.893 START TEST reactor_set_interrupt 00:30:57.893 ************************************ 00:30:57.893 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:57.893 * Looking for test storage... 00:30:57.893 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:57.893 20:06:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:57.893 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:57.893 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:57.893 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:57.893 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:30:57.893 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:57.894 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:57.894 20:06:49 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:57.894 20:06:49 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:57.894 20:06:49 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:57.894 20:06:49 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:57.894 20:06:49 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:57.894 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:57.894 20:06:49 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:57.894 #define SPDK_CONFIG_H 00:30:57.894 #define SPDK_CONFIG_APPS 1 00:30:57.894 #define SPDK_CONFIG_ARCH native 00:30:57.894 #undef SPDK_CONFIG_ASAN 00:30:57.894 #undef SPDK_CONFIG_AVAHI 00:30:57.894 #undef SPDK_CONFIG_CET 00:30:57.894 #define SPDK_CONFIG_COVERAGE 1 00:30:57.894 #define SPDK_CONFIG_CROSS_PREFIX 
00:30:57.895 #define SPDK_CONFIG_CRYPTO 1 00:30:57.895 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:57.895 #undef SPDK_CONFIG_CUSTOMOCF 00:30:57.895 #undef SPDK_CONFIG_DAOS 00:30:57.895 #define SPDK_CONFIG_DAOS_DIR 00:30:57.895 #define SPDK_CONFIG_DEBUG 1 00:30:57.895 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:57.895 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:57.895 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:57.895 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:57.895 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:57.895 #undef SPDK_CONFIG_DPDK_UADK 00:30:57.895 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:57.895 #define SPDK_CONFIG_EXAMPLES 1 00:30:57.895 #undef SPDK_CONFIG_FC 00:30:57.895 #define SPDK_CONFIG_FC_PATH 00:30:57.895 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:57.895 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:57.895 #undef SPDK_CONFIG_FUSE 00:30:57.895 #undef SPDK_CONFIG_FUZZER 00:30:57.895 #define SPDK_CONFIG_FUZZER_LIB 00:30:57.895 #undef SPDK_CONFIG_GOLANG 00:30:57.895 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:57.895 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:57.895 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:57.895 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:57.895 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:57.895 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:57.895 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:57.895 #define SPDK_CONFIG_IDXD 1 00:30:57.895 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:57.895 #define SPDK_CONFIG_IPSEC_MB 1 00:30:57.895 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:57.895 #define SPDK_CONFIG_ISAL 1 00:30:57.895 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:57.895 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:57.895 #define SPDK_CONFIG_LIBDIR 00:30:57.895 #undef SPDK_CONFIG_LTO 00:30:57.895 #define SPDK_CONFIG_MAX_LCORES 128 00:30:57.895 #define SPDK_CONFIG_NVME_CUSE 1 00:30:57.895 #undef 
SPDK_CONFIG_OCF 00:30:57.895 #define SPDK_CONFIG_OCF_PATH 00:30:57.895 #define SPDK_CONFIG_OPENSSL_PATH 00:30:57.895 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:57.895 #define SPDK_CONFIG_PGO_DIR 00:30:57.895 #undef SPDK_CONFIG_PGO_USE 00:30:57.895 #define SPDK_CONFIG_PREFIX /usr/local 00:30:57.895 #undef SPDK_CONFIG_RAID5F 00:30:57.895 #undef SPDK_CONFIG_RBD 00:30:57.895 #define SPDK_CONFIG_RDMA 1 00:30:57.895 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:57.895 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:57.895 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:57.895 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:57.895 #define SPDK_CONFIG_SHARED 1 00:30:57.895 #undef SPDK_CONFIG_SMA 00:30:57.895 #define SPDK_CONFIG_TESTS 1 00:30:57.895 #undef SPDK_CONFIG_TSAN 00:30:57.895 #define SPDK_CONFIG_UBLK 1 00:30:57.895 #define SPDK_CONFIG_UBSAN 1 00:30:57.895 #undef SPDK_CONFIG_UNIT_TESTS 00:30:57.895 #undef SPDK_CONFIG_URING 00:30:57.895 #define SPDK_CONFIG_URING_PATH 00:30:57.895 #undef SPDK_CONFIG_URING_ZNS 00:30:57.895 #undef SPDK_CONFIG_USDT 00:30:57.895 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:57.895 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:57.895 #undef SPDK_CONFIG_VFIO_USER 00:30:57.895 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:57.895 #define SPDK_CONFIG_VHOST 1 00:30:57.895 #define SPDK_CONFIG_VIRTIO 1 00:30:57.895 #undef SPDK_CONFIG_VTUNE 00:30:57.895 #define SPDK_CONFIG_VTUNE_DIR 00:30:57.895 #define SPDK_CONFIG_WERROR 1 00:30:57.895 #define SPDK_CONFIG_WPDK_DIR 00:30:57.895 #undef SPDK_CONFIG_XNVME 00:30:57.895 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:57.895 20:06:49 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:57.895 20:06:49 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:30:57.895 20:06:49 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:57.895 20:06:49 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:57.895 20:06:49 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:57.895 20:06:49 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:57.895 20:06:49 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:57.895 20:06:49 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:30:57.895 20:06:49 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:57.895 20:06:49 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:57.895 20:06:49 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:57.895 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:57.896 20:06:49 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 1 00:30:57.896 
20:06:49 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@166 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@173 -- # : 0 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@179 -- # 
VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@180 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:57.896 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@187 -- # export 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@202 -- # cat 00:30:57.897 
20:06:49 reactor_set_interrupt -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:57.897 20:06:49 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@265 -- # export valgrind= 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@265 -- # valgrind= 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@271 -- # uname -s 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@281 -- # MAKE=make 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j72 00:30:57.897 20:06:49 reactor_set_interrupt -- 
common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@301 -- # TEST_MODE= 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@320 -- # [[ -z 1546467 ]] 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@320 -- # kill -0 1546467 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local mount target_dir 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.JxWBUc 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@357 -- # mkdir 
-p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.JxWBUc/tests/interrupt /tmp/spdk.JxWBUc 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@329 -- # df -T 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:30:57.897 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=946290688 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4338139136 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:30:57.898 20:06:49 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=88883335168 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=94508535808 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=5625200640 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=47250890752 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=47254265856 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=3375104 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=18892296192 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=18901708800 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=9412608 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=47253770240 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=47254269952 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=499712 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=9450848256 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=9450852352 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:30:57.898 * Looking for test storage... 
00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@370 -- # local target_space new_size 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@374 -- # mount=/ 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@376 -- # target_space=88883335168 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@383 -- # new_size=7839793152 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 
00:30:57.898 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@391 -- # return 0 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:57.898 20:06:49 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1546511 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:57.898 20:06:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1546511 /var/tmp/spdk.sock 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1546511 ']' 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:57.898 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:57.898 20:06:49 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:57.898 [2024-07-24 20:06:49.382442] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:30:57.898 [2024-07-24 20:06:49.382493] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1546511 ] 00:30:58.157 [2024-07-24 20:06:49.490986] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:58.157 [2024-07-24 20:06:49.596794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:58.157 [2024-07-24 20:06:49.596897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:58.157 [2024-07-24 20:06:49.596897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:58.157 [2024-07-24 20:06:49.671421] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:59.095 20:06:50 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:59.095 20:06:50 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:30:59.095 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:30:59.095 20:06:50 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:59.095 Malloc0 00:30:59.095 Malloc1 00:30:59.095 Malloc2 00:30:59.095 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:30:59.095 20:06:50 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:59.095 20:06:50 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:59.095 20:06:50 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:59.095 5000+0 records in 00:30:59.095 5000+0 records out 00:30:59.095 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0254603 s, 402 MB/s 
00:30:59.095 20:06:50 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:59.354 AIO0 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1546511 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1546511 without_thd 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1546511 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:59.354 20:06:50 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:59.612 20:06:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:59.612 20:06:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:59.612 20:06:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:59.612 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:59.612 20:06:51 reactor_set_interrupt 
-- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:59.612 20:06:51 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:59.612 20:06:51 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:59.612 20:06:51 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:59.612 20:06:51 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:59.900 spdk_thread ids are 1 on reactor0. 
00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1546511 0 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1546511 0 idle 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1546511 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1546511 -w 256 00:30:59.900 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1546511 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0' 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1546511 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:31:00.159 20:06:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1546511 1 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1546511 1 idle 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1546511 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1546511 -w 256 00:31:00.160 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1546514 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1546514 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:31:00.418 20:06:51 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:00.418 20:06:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1546511 2 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1546511 2 idle 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1546511 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1546511 -w 256 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1546515 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 
00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1546515 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:00.419 20:06:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:31:00.419 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:31:00.419 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:31:00.678 [2024-07-24 20:06:52.229919] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:31:00.678 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:31:00.937 [2024-07-24 20:06:52.493667] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 
00:31:00.937 [2024-07-24 20:06:52.494046] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:01.196 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:31:01.196 [2024-07-24 20:06:52.757515] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:31:01.196 [2024-07-24 20:06:52.757706] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1546511 0 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1546511 0 busy 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1546511 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1546511 -w 256 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1546511 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0' 00:31:01.454 20:06:52 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 1546511 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:01.454 20:06:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1546511 2 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1546511 2 busy 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1546511 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1546511 -w 256 00:31:01.455 20:06:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:01.713 
20:06:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1546515 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2' 00:31:01.713 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1546515 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.37 reactor_2 00:31:01.713 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:01.713 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:01.713 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:01.714 20:06:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:01.714 20:06:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:01.714 20:06:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:01.714 20:06:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:01.714 20:06:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:01.714 20:06:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:31:01.971 [2024-07-24 20:06:53.381497] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:31:01.971 [2024-07-24 20:06:53.381614] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1546511 2 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1546511 2 idle 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1546511 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1546511 -w 256 00:31:01.971 20:06:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:02.229 20:06:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1546515 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.62 reactor_2' 00:31:02.229 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1546515 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.62 reactor_2 00:31:02.229 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:02.229 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:02.229 20:06:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:02.229 20:06:53 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:02.229 20:06:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:02.229 20:06:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:02.230 20:06:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:02.230 20:06:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:02.230 20:06:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:31:02.230 [2024-07-24 20:06:53.821507] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:31:02.230 [2024-07-24 20:06:53.821664] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:02.488 20:06:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:31:02.488 20:06:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:31:02.488 20:06:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:31:02.747 [2024-07-24 20:06:54.085949] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1546511 0 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1546511 0 idle 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1546511 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1546511 -w 256 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1546511 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0' 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1546511 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:02.747 20:06:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:31:02.748 20:06:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:02.748 20:06:54 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:02.748 20:06:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:31:02.748 20:06:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:31:02.748 20:06:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:31:02.748 20:06:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1546511 00:31:02.748 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1546511 ']' 00:31:02.748 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 1546511 00:31:02.748 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:31:02.748 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:02.748 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1546511 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1546511' 00:31:03.007 killing process with pid 1546511 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1546511 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1546511 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:03.007 20:06:54 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1547238 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:03.007 20:06:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1547238 /var/tmp/spdk.sock 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1547238 ']' 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:03.007 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:03.007 20:06:54 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:03.266 [2024-07-24 20:06:54.630732] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
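`waitforlisten` above blocks until the freshly started `interrupt_tgt` creates its RPC socket at `/var/tmp/spdk.sock`. A hedged sketch of that kind of wait loop (the retry count and sleep interval are illustrative; the real helper in autotest_common.sh does more, e.g. probing the RPC itself):

```shell
# Poll until a path (such as a UNIX-domain RPC socket) appears,
# giving up after max_retries attempts. Returns 0 once it exists.
wait_for_path() {
  local path="$1" max_retries="${2:-100}"
  while (( max_retries-- > 0 )); do
    [ -e "$path" ] && return 0
    sleep 0.1
  done
  return 1
}
```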
00:31:03.266 [2024-07-24 20:06:54.630804] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1547238 ] 00:31:03.266 [2024-07-24 20:06:54.760395] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:03.524 [2024-07-24 20:06:54.862705] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:03.524 [2024-07-24 20:06:54.862807] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:03.524 [2024-07-24 20:06:54.862809] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:03.524 [2024-07-24 20:06:54.942755] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:31:04.092 20:06:55 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:04.092 20:06:55 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:31:04.092 20:06:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:31:04.092 20:06:55 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:04.351 Malloc0 00:31:04.351 Malloc1 00:31:04.351 Malloc2 00:31:04.351 20:06:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:31:04.351 20:06:55 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:31:04.351 20:06:55 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:04.351 20:06:55 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:04.351 5000+0 records in 00:31:04.351 5000+0 records out 00:31:04.351 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0254316 s, 403 MB/s 
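The AIO backing file above is created with `dd` before `bdev_aio_create` registers it: 5000 blocks of 2048 bytes, i.e. the 10,240,000 bytes the log reports. A sketch of that setup with an explicit size check (the function name and temp path are illustrative):

```shell
# Create a zero-filled backing file for an AIO bdev and print its size:
# bs=2048 count=5000 matches the dd invocation in interrupt/common.sh@76.
make_aiofile() {
  local path="$1"
  dd if=/dev/zero of="$path" bs=2048 count=5000 2>/dev/null
  stat -c %s "$path"   # prints the size in bytes (GNU coreutils)
}
```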
00:31:04.351 20:06:55 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:04.608 AIO0 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1547238 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1547238 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1547238 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:04.608 20:06:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:31:04.867 20:06:56 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:04.867 20:06:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:31:05.436 spdk_thread ids are 1 on reactor0. 
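`reactor_get_thread_ids` above converts the hex cpumask (`0x1`, `0x4`) into the decimal string passed to `jq --arg reactor_cpumask` (`1`, `4`). In bash that is plain arithmetic expansion; a minimal sketch (the assumption that `thread_get_stats` reports `.cpumask` in decimal is read off the log's `--arg reactor_cpumask 4`):

```shell
# Convert a hex cpumask like 0x4 into the decimal form compared against
# the .cpumask field in thread_get_stats output, per common.sh@58.
mask_to_decimal() {
  printf '%d\n' "$(( $1 ))"
}
```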
00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1547238 0 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1547238 0 idle 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1547238 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:05.436 20:06:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1547238 -w 256 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1547238 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.40 reactor_0' 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1547238 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.40 reactor_0 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1547238 1 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1547238 1 idle 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1547238 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1547238 -w 256 00:31:05.695 20:06:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1547281 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1' 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1547281 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:05.955 20:06:57 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1547238 2 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1547238 2 idle 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1547238 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1547238 -w 256 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1547282 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 
reactor_2' 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1547282 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:31:05.955 20:06:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:31:06.214 [2024-07-24 20:06:57.727586] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:31:06.214 [2024-07-24 20:06:57.727799] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 
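The busy/idle comparisons traced here imply two thresholds: a "busy" check passes only when the sampled CPU rate is not below 70 (`[[ 99 -lt 70 ]]` must fail), and an "idle" check passes only when it is not above 30 (`[[ 0 -gt 30 ]]` must fail). A condensed sketch of that logic (thresholds read off the log's comparisons; the function name is illustrative):

```shell
# Classify a reactor's sampled CPU rate the way the traced checks do:
# "busy" requires >= 70% CPU and "idle" requires <= 30%; else fail.
check_state() {
  local state="$1" cpu_rate="$2"
  if [[ $state == busy ]]; then
    [[ $cpu_rate -lt 70 ]] && return 1
  elif [[ $state == idle ]]; then
    [[ $cpu_rate -gt 30 ]] && return 1
  fi
  return 0
}
```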
00:31:06.214 [2024-07-24 20:06:57.727978] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:06.214 20:06:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:31:06.473 [2024-07-24 20:06:57.980035] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:31:06.473 [2024-07-24 20:06:57.980254] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1547238 0 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1547238 0 busy 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1547238 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1547238 -w 256 00:31:06.473 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1547238 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.84 reactor_0' 00:31:06.732 20:06:58 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 1547238 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.84 reactor_0 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1547238 2 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1547238 2 busy 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1547238 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1547238 -w 256 00:31:06.732 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:06.991 
20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1547282 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2' 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1547282 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:06.991 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:31:07.249 [2024-07-24 20:06:58.589826] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
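Each busy/idle probe above runs inside a countdown loop, `(( j = 10 ))` down to zero, re-sampling `top` until the threshold holds. The same pattern as a generic retry helper (the attempt count mirrors the log; the command being retried is up to the caller):

```shell
# Re-run a command up to 10 times until it succeeds, mirroring the
# (( j = 10 )); (( j != 0 )) countdown in interrupt/common.sh@23.
retry() {
  local j
  for (( j = 10; j != 0; j-- )); do
    "$@" && return 0
  done
  return 1
}
```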
00:31:07.249 [2024-07-24 20:06:58.589971] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1547238 2 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1547238 2 idle 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1547238 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1547238 -w 256 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1547282 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2' 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1547282 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:07.249 20:06:58 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:07.249 20:06:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:31:07.508 [2024-07-24 20:06:59.022943] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:31:07.508 [2024-07-24 20:06:59.023474] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 00:31:07.508 [2024-07-24 20:06:59.023499] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1547238 0 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1547238 0 idle 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1547238 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:07.508 20:06:59 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1547238 -w 256 00:31:07.508 20:06:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1547238 root 20 0 128.2g 36288 23040 R 0.0 0.0 0:01.69 reactor_0' 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1547238 root 20 0 128.2g 36288 23040 R 0.0 0.0 0:01.69 reactor_0 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:31:07.767 20:06:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1547238 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1547238 ']' 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 
1547238 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1547238 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1547238' 00:31:07.767 killing process with pid 1547238 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1547238 00:31:07.767 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1547238 00:31:08.026 20:06:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:31:08.026 20:06:59 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:08.026 00:31:08.026 real 0m10.490s 00:31:08.026 user 0m9.948s 00:31:08.026 sys 0m2.219s 00:31:08.026 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:08.026 20:06:59 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:08.026 ************************************ 00:31:08.026 END TEST reactor_set_interrupt 00:31:08.026 ************************************ 00:31:08.026 20:06:59 -- spdk/autotest.sh@198 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:08.026 20:06:59 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:31:08.026 20:06:59 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:08.026 20:06:59 -- common/autotest_common.sh@10 -- # set +x 00:31:08.287 
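The `killprocess` sequence traced above guards the kill: it rejects an empty pid, verifies the process is alive with `kill -0`, reports the command name via `ps --no-headers -o comm=` (and refuses to kill anything running as `sudo`), then kills and waits. A condensed sketch of those guards (simplified; the real helper lives in autotest_common.sh):

```shell
# Guarded kill: mirror killprocess's '[ -z ]' and 'kill -0' checks
# before signalling, then reap the child so the pid is fully gone.
killprocess_sketch() {
  local pid="$1"
  [ -z "$pid" ] && return 1               # no pid given
  kill -0 "$pid" 2>/dev/null || return 1  # not running
  ps --no-headers -o comm= "$pid"         # report the command name
  kill "$pid" 2>/dev/null
  wait "$pid" 2>/dev/null
  return 0
}
```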
************************************ 00:31:08.287 START TEST reap_unregistered_poller 00:31:08.287 ************************************ 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:08.287 * Looking for test storage... 00:31:08.287 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.287 20:06:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:31:08.287 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:08.287 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.287 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.287 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
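The `reap_unregistered_poller` prologue above derives `testdir` from the script's own location and `rootdir` from `../..` via `readlink -f`, exactly as interrupt_common.sh@5-6 traces. A minimal sketch of that resolution (the function name is illustrative):

```shell
# Resolve a test script's directory and the repository root the way
# interrupt_common.sh does: dirname of the script, then ../.. from it.
resolve_dirs() {
  local script="$1" testdir rootdir
  testdir=$(readlink -f "$(dirname "$script")")
  rootdir=$(readlink -f "$testdir/../..")
  echo "$testdir $rootdir"
}
```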
00:31:08.287 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:08.287 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:31:08.287 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@7 -- # 
CONFIG_PREFIX=/usr/local 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:31:08.287 
20:06:59 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:31:08.287 20:06:59 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:31:08.287 20:06:59 reap_unregistered_poller -- 
common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@65 
-- # CONFIG_APPS=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:31:08.288 20:06:59 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:31:08.288 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:08.288 20:06:59 
reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:31:08.288 20:06:59 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef 
SPDK_CONFIG_H 00:31:08.288 #define SPDK_CONFIG_H 00:31:08.288 #define SPDK_CONFIG_APPS 1 00:31:08.288 #define SPDK_CONFIG_ARCH native 00:31:08.288 #undef SPDK_CONFIG_ASAN 00:31:08.288 #undef SPDK_CONFIG_AVAHI 00:31:08.288 #undef SPDK_CONFIG_CET 00:31:08.288 #define SPDK_CONFIG_COVERAGE 1 00:31:08.288 #define SPDK_CONFIG_CROSS_PREFIX 00:31:08.288 #define SPDK_CONFIG_CRYPTO 1 00:31:08.288 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:31:08.288 #undef SPDK_CONFIG_CUSTOMOCF 00:31:08.288 #undef SPDK_CONFIG_DAOS 00:31:08.288 #define SPDK_CONFIG_DAOS_DIR 00:31:08.288 #define SPDK_CONFIG_DEBUG 1 00:31:08.288 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:31:08.288 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:08.288 #define SPDK_CONFIG_DPDK_INC_DIR 00:31:08.288 #define SPDK_CONFIG_DPDK_LIB_DIR 00:31:08.288 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:31:08.288 #undef SPDK_CONFIG_DPDK_UADK 00:31:08.288 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:08.288 #define SPDK_CONFIG_EXAMPLES 1 00:31:08.288 #undef SPDK_CONFIG_FC 00:31:08.288 #define SPDK_CONFIG_FC_PATH 00:31:08.288 #define SPDK_CONFIG_FIO_PLUGIN 1 00:31:08.288 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:31:08.288 #undef SPDK_CONFIG_FUSE 00:31:08.288 #undef SPDK_CONFIG_FUZZER 00:31:08.288 #define SPDK_CONFIG_FUZZER_LIB 00:31:08.288 #undef SPDK_CONFIG_GOLANG 00:31:08.288 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:31:08.288 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:31:08.288 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:31:08.288 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:31:08.288 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:31:08.288 #undef SPDK_CONFIG_HAVE_LIBBSD 00:31:08.288 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:31:08.288 #define SPDK_CONFIG_IDXD 1 00:31:08.288 #define SPDK_CONFIG_IDXD_KERNEL 1 00:31:08.288 #define SPDK_CONFIG_IPSEC_MB 1 00:31:08.288 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 
00:31:08.288 #define SPDK_CONFIG_ISAL 1 00:31:08.288 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:31:08.288 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:31:08.288 #define SPDK_CONFIG_LIBDIR 00:31:08.288 #undef SPDK_CONFIG_LTO 00:31:08.288 #define SPDK_CONFIG_MAX_LCORES 128 00:31:08.288 #define SPDK_CONFIG_NVME_CUSE 1 00:31:08.288 #undef SPDK_CONFIG_OCF 00:31:08.288 #define SPDK_CONFIG_OCF_PATH 00:31:08.288 #define SPDK_CONFIG_OPENSSL_PATH 00:31:08.288 #undef SPDK_CONFIG_PGO_CAPTURE 00:31:08.288 #define SPDK_CONFIG_PGO_DIR 00:31:08.288 #undef SPDK_CONFIG_PGO_USE 00:31:08.288 #define SPDK_CONFIG_PREFIX /usr/local 00:31:08.288 #undef SPDK_CONFIG_RAID5F 00:31:08.288 #undef SPDK_CONFIG_RBD 00:31:08.288 #define SPDK_CONFIG_RDMA 1 00:31:08.288 #define SPDK_CONFIG_RDMA_PROV verbs 00:31:08.288 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:31:08.288 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:31:08.288 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:31:08.288 #define SPDK_CONFIG_SHARED 1 00:31:08.288 #undef SPDK_CONFIG_SMA 00:31:08.288 #define SPDK_CONFIG_TESTS 1 00:31:08.288 #undef SPDK_CONFIG_TSAN 00:31:08.288 #define SPDK_CONFIG_UBLK 1 00:31:08.288 #define SPDK_CONFIG_UBSAN 1 00:31:08.288 #undef SPDK_CONFIG_UNIT_TESTS 00:31:08.288 #undef SPDK_CONFIG_URING 00:31:08.288 #define SPDK_CONFIG_URING_PATH 00:31:08.288 #undef SPDK_CONFIG_URING_ZNS 00:31:08.288 #undef SPDK_CONFIG_USDT 00:31:08.288 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:31:08.288 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:31:08.288 #undef SPDK_CONFIG_VFIO_USER 00:31:08.288 #define SPDK_CONFIG_VFIO_USER_DIR 00:31:08.288 #define SPDK_CONFIG_VHOST 1 00:31:08.288 #define SPDK_CONFIG_VIRTIO 1 00:31:08.288 #undef SPDK_CONFIG_VTUNE 00:31:08.288 #define SPDK_CONFIG_VTUNE_DIR 00:31:08.288 #define SPDK_CONFIG_WERROR 1 00:31:08.288 #define SPDK_CONFIG_WPDK_DIR 00:31:08.288 #undef SPDK_CONFIG_XNVME 00:31:08.288 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:31:08.288 20:06:59 reap_unregistered_poller 
-- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:31:08.288 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:08.288 20:06:59 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:08.288 20:06:59 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:08.288 20:06:59 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:08.288 20:06:59 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:08.288 20:06:59 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:08.289 20:06:59 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:08.289 20:06:59 
reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:31:08.289 20:06:59 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:31:08.289 20:06:59 reap_unregistered_poller -- 
pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:31:08.289 20:06:59 reap_unregistered_poller -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- 
common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- 
common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:31:08.289 20:06:59 reap_unregistered_poller -- 
common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:31:08.289 20:06:59 reap_unregistered_poller -- 
common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:31:08.289 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- 
common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 1 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@173 -- # : 0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- 
common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@179 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@180 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@183 -- # export 
PCI_BLOCK_SYNC_ON_RESET=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@187 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@200 -- # 
asan_suppression_file=/var/tmp/asan_suppression_file 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@202 -- # cat 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:08.290 20:06:59 reap_unregistered_poller -- 
common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@265 -- # export valgrind= 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@265 -- # valgrind= 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@271 -- # uname -s 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@278 -- # export 
HUGE_EVEN_ALLOC=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@281 -- # MAKE=make 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j72 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@301 -- # TEST_MODE= 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@320 -- # [[ -z 1547919 ]] 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@320 -- # kill -0 1547919 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local mount target_dir 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:31:08.290 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.ZwxlQC 00:31:08.291 20:06:59 
reap_unregistered_poller -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@357 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.ZwxlQC/tests/interrupt /tmp/spdk.ZwxlQC 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@329 -- # df -T 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=946290688 
00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4338139136 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=88883187712 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=94508535808 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=5625348096 00:31:08.291 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=47250890752 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=47254265856 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=3375104 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 
-- # avails["$mount"]=18892296192 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=18901708800 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=9412608 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=47253770240 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=47254269952 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=499712 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=9450848256 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=9450852352 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:31:08.551 * Looking for test storage... 
00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@370 -- # local target_space new_size 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@374 -- # mount=/ 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@376 -- # target_space=88883187712 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@383 -- # new_size=7839940608 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.551 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@391 -- # return 0 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1547966 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:08.551 20:06:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1547966 /var/tmp/spdk.sock 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@831 -- # '[' -z 1547966 ']' 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:08.551 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:08.551 20:06:59 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:08.551 [2024-07-24 20:06:59.946583] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:31:08.551 [2024-07-24 20:06:59.946659] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1547966 ] 00:31:08.551 [2024-07-24 20:07:00.079249] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:08.811 [2024-07-24 20:07:00.188540] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:08.811 [2024-07-24 20:07:00.188640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:08.811 [2024-07-24 20:07:00.188641] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:08.811 [2024-07-24 20:07:00.262977] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:31:09.379 20:07:00 reap_unregistered_poller -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:09.379 20:07:00 reap_unregistered_poller -- common/autotest_common.sh@864 -- # return 0 00:31:09.379 20:07:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:31:09.379 20:07:00 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:09.379 20:07:00 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:09.379 20:07:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:31:09.379 20:07:00 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:09.379 20:07:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:31:09.379 "name": "app_thread", 00:31:09.379 "id": 1, 00:31:09.379 "active_pollers": [], 00:31:09.379 "timed_pollers": [ 00:31:09.379 { 00:31:09.379 "name": "rpc_subsystem_poll_servers", 00:31:09.379 "id": 1, 00:31:09.379 "state": "waiting", 00:31:09.379 "run_count": 0, 00:31:09.379 "busy_count": 0, 00:31:09.379 "period_ticks": 9200000 00:31:09.379 } 00:31:09.379 ], 00:31:09.379 "paused_pollers": [] 00:31:09.379 }' 00:31:09.379 20:07:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:31:09.639 
20:07:01 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:09.639 5000+0 records in 00:31:09.639 5000+0 records out 00:31:09.639 10240000 bytes (10 MB, 9.8 MiB) copied, 0.026603 s, 385 MB/s 00:31:09.639 20:07:01 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:09.898 AIO0 00:31:09.898 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:10.157 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:31:10.157 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:31:10.157 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:10.157 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:31:10.157 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:31:10.416 "name": "app_thread", 00:31:10.416 "id": 1, 00:31:10.416 "active_pollers": [], 00:31:10.416 "timed_pollers": [ 00:31:10.416 { 00:31:10.416 "name": "rpc_subsystem_poll_servers", 00:31:10.416 "id": 1, 00:31:10.416 "state": "waiting", 00:31:10.416 "run_count": 0, 00:31:10.416 "busy_count": 0, 
00:31:10.416 "period_ticks": 9200000 00:31:10.416 } 00:31:10.416 ], 00:31:10.416 "paused_pollers": [] 00:31:10.416 }' 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:31:10.416 20:07:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1547966 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@950 -- # '[' -z 1547966 ']' 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@954 -- # kill -0 1547966 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@955 -- # uname 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1547966 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@968 -- # 
echo 'killing process with pid 1547966' 00:31:10.416 killing process with pid 1547966 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@969 -- # kill 1547966 00:31:10.416 20:07:01 reap_unregistered_poller -- common/autotest_common.sh@974 -- # wait 1547966 00:31:10.675 20:07:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:31:10.675 20:07:02 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:10.675 00:31:10.675 real 0m2.544s 00:31:10.675 user 0m1.641s 00:31:10.675 sys 0m0.662s 00:31:10.675 20:07:02 reap_unregistered_poller -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:10.675 20:07:02 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:10.675 ************************************ 00:31:10.675 END TEST reap_unregistered_poller 00:31:10.675 ************************************ 00:31:10.675 20:07:02 -- spdk/autotest.sh@202 -- # uname -s 00:31:10.675 20:07:02 -- spdk/autotest.sh@202 -- # [[ Linux == Linux ]] 00:31:10.675 20:07:02 -- spdk/autotest.sh@203 -- # [[ 1 -eq 1 ]] 00:31:10.675 20:07:02 -- spdk/autotest.sh@209 -- # [[ 1 -eq 0 ]] 00:31:10.675 20:07:02 -- spdk/autotest.sh@215 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@264 -- # timing_exit lib 00:31:10.675 20:07:02 -- common/autotest_common.sh@730 -- # xtrace_disable 00:31:10.675 20:07:02 -- common/autotest_common.sh@10 -- # set +x 00:31:10.675 20:07:02 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@283 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 
-- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@347 -- # '[' 0 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@351 -- # '[' 1 -eq 1 ']' 00:31:10.675 20:07:02 -- spdk/autotest.sh@352 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:10.675 20:07:02 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:10.675 20:07:02 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:10.675 20:07:02 -- common/autotest_common.sh@10 -- # set +x 00:31:10.934 ************************************ 00:31:10.934 START TEST compress_compdev 00:31:10.934 ************************************ 00:31:10.934 20:07:02 compress_compdev -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:10.934 * Looking for test storage... 
00:31:10.934 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:10.934 20:07:02 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:10.934 20:07:02 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:10.934 20:07:02 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:10.934 20:07:02 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:10.934 20:07:02 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:10.934 20:07:02 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:10.934 20:07:02 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:31:10.934 20:07:02 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:10.934 20:07:02 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1548405 00:31:10.934 20:07:02 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:10.934 20:07:02 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1548405 00:31:10.934 20:07:02 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1548405 ']' 00:31:10.934 20:07:02 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:10.934 20:07:02 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:10.934 20:07:02 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:10.934 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:10.935 20:07:02 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:10.935 20:07:02 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:10.935 [2024-07-24 20:07:02.508047] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
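The `waitforlisten` lines above show the harness polling (with `max_retries=100`) for bdevperf's RPC socket at `/var/tmp/spdk.sock` before issuing any RPCs. A hedged sketch of that loop, under the simplifying assumption that only path existence is checked (the real helper also confirms the server answers an `rpc.py` probe):

```shell
# Hedged sketch of the waitforlisten polling loop seen above: wait for
# the RPC socket path to appear, up to max_retries attempts.
# waitforlisten_sketch is a hypothetical name for illustration only.
waitforlisten_sketch() {
    local rpc_addr=${1:-/var/tmp/spdk.sock}
    local max_retries=${2:-100}
    local i=0
    while [ ! -e "$rpc_addr" ]; do
        i=$((i + 1))
        [ "$i" -gt "$max_retries" ] && return 1
        sleep 0.1
    done
    return 0
}
```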
00:31:10.935 [2024-07-24 20:07:02.508134] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1548405 ] 00:31:11.193 [2024-07-24 20:07:02.656320] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:11.193 [2024-07-24 20:07:02.775569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:11.193 [2024-07-24 20:07:02.775575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:12.568 [2024-07-24 20:07:03.753846] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:12.568 20:07:03 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:12.568 20:07:03 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:31:12.568 20:07:03 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:31:12.568 20:07:03 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:12.568 20:07:03 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:12.827 [2024-07-24 20:07:04.360962] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1121430 PMD being used: compress_qat 00:31:12.827 20:07:04 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:12.827 20:07:04 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:12.827 20:07:04 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:12.827 20:07:04 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:12.827 20:07:04 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:12.827 20:07:04 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 
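The `create_vols` phase that begins here issues a short RPC sequence, visible piecemeal across the following trace lines. Gathered into one hedged sketch (the commands are taken verbatim from this log; the pipe between `gen_nvme.sh` and `load_subsystem_config` is an assumption about how `compress.sh` plumbs them, and the fragment is not runnable without a live bdevperf started with `-z` listening on `/var/tmp/spdk.sock`):

```shell
# Hedged sketch of the create_vols RPC sequence recorded in this log.
# Assumes $rootdir is an SPDK checkout; command fragment only.
rpc_py="$rootdir/scripts/rpc.py"

# Attach the NVMe controller: gen_nvme.sh emits the subsystem config
# that load_subsystem_config consumes (pipe assumed, see lead-in).
"$rootdir/scripts/gen_nvme.sh" | $rpc_py load_subsystem_config

# Logical-volume store on Nvme0n1, then a 100 MiB thin-provisioned
# volume (-t) inside it.
$rpc_py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
$rpc_py bdev_lvol_create -t -l lvs0 lv0 100

# Wrap the volume in a compress bdev backed by /tmp/pmem; the later
# 512-byte run in this log appends "-l 512" to this call.
$rpc_py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem
```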
00:31:12.827 20:07:04 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:13.084 20:07:04 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:13.343 [ 00:31:13.343 { 00:31:13.343 "name": "Nvme0n1", 00:31:13.343 "aliases": [ 00:31:13.343 "01000000-0000-0000-5cd2-e43197705251" 00:31:13.343 ], 00:31:13.343 "product_name": "NVMe disk", 00:31:13.343 "block_size": 512, 00:31:13.343 "num_blocks": 15002931888, 00:31:13.343 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:13.343 "assigned_rate_limits": { 00:31:13.343 "rw_ios_per_sec": 0, 00:31:13.343 "rw_mbytes_per_sec": 0, 00:31:13.343 "r_mbytes_per_sec": 0, 00:31:13.343 "w_mbytes_per_sec": 0 00:31:13.343 }, 00:31:13.343 "claimed": false, 00:31:13.343 "zoned": false, 00:31:13.343 "supported_io_types": { 00:31:13.343 "read": true, 00:31:13.343 "write": true, 00:31:13.343 "unmap": true, 00:31:13.343 "flush": true, 00:31:13.343 "reset": true, 00:31:13.343 "nvme_admin": true, 00:31:13.343 "nvme_io": true, 00:31:13.343 "nvme_io_md": false, 00:31:13.343 "write_zeroes": true, 00:31:13.343 "zcopy": false, 00:31:13.343 "get_zone_info": false, 00:31:13.343 "zone_management": false, 00:31:13.343 "zone_append": false, 00:31:13.343 "compare": false, 00:31:13.343 "compare_and_write": false, 00:31:13.343 "abort": true, 00:31:13.343 "seek_hole": false, 00:31:13.343 "seek_data": false, 00:31:13.343 "copy": false, 00:31:13.343 "nvme_iov_md": false 00:31:13.343 }, 00:31:13.343 "driver_specific": { 00:31:13.343 "nvme": [ 00:31:13.343 { 00:31:13.343 "pci_address": "0000:5e:00.0", 00:31:13.343 "trid": { 00:31:13.343 "trtype": "PCIe", 00:31:13.343 "traddr": "0000:5e:00.0" 00:31:13.343 }, 00:31:13.343 "ctrlr_data": { 00:31:13.343 "cntlid": 0, 00:31:13.343 "vendor_id": "0x8086", 00:31:13.343 "model_number": "INTEL SSDPF2KX076TZO", 00:31:13.343 
"serial_number": "PHAC0301002G7P6CGN", 00:31:13.343 "firmware_revision": "JCV10200", 00:31:13.343 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:13.343 "oacs": { 00:31:13.343 "security": 1, 00:31:13.343 "format": 1, 00:31:13.343 "firmware": 1, 00:31:13.343 "ns_manage": 1 00:31:13.343 }, 00:31:13.343 "multi_ctrlr": false, 00:31:13.343 "ana_reporting": false 00:31:13.343 }, 00:31:13.343 "vs": { 00:31:13.343 "nvme_version": "1.3" 00:31:13.343 }, 00:31:13.343 "ns_data": { 00:31:13.343 "id": 1, 00:31:13.343 "can_share": false 00:31:13.343 }, 00:31:13.343 "security": { 00:31:13.343 "opal": true 00:31:13.343 } 00:31:13.343 } 00:31:13.343 ], 00:31:13.343 "mp_policy": "active_passive" 00:31:13.343 } 00:31:13.343 } 00:31:13.343 ] 00:31:13.343 20:07:04 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:13.343 20:07:04 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:13.910 [2024-07-24 20:07:05.400970] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf58210 PMD being used: compress_qat 00:31:16.443 63ab9d08-9dfc-4b4e-b6e7-53fcd1a9dfa5 00:31:16.443 20:07:07 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:16.443 b4654daa-c07c-47db-90af-9801f0122125 00:31:16.443 20:07:07 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:16.443 20:07:07 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:16.443 20:07:07 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:16.443 20:07:07 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:16.443 20:07:07 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:16.443 20:07:07 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 
00:31:16.443 20:07:07 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:16.701 20:07:08 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:16.964 [ 00:31:16.964 { 00:31:16.964 "name": "b4654daa-c07c-47db-90af-9801f0122125", 00:31:16.964 "aliases": [ 00:31:16.964 "lvs0/lv0" 00:31:16.964 ], 00:31:16.964 "product_name": "Logical Volume", 00:31:16.964 "block_size": 512, 00:31:16.964 "num_blocks": 204800, 00:31:16.964 "uuid": "b4654daa-c07c-47db-90af-9801f0122125", 00:31:16.964 "assigned_rate_limits": { 00:31:16.964 "rw_ios_per_sec": 0, 00:31:16.964 "rw_mbytes_per_sec": 0, 00:31:16.964 "r_mbytes_per_sec": 0, 00:31:16.964 "w_mbytes_per_sec": 0 00:31:16.964 }, 00:31:16.964 "claimed": false, 00:31:16.964 "zoned": false, 00:31:16.964 "supported_io_types": { 00:31:16.964 "read": true, 00:31:16.964 "write": true, 00:31:16.964 "unmap": true, 00:31:16.964 "flush": false, 00:31:16.964 "reset": true, 00:31:16.964 "nvme_admin": false, 00:31:16.964 "nvme_io": false, 00:31:16.964 "nvme_io_md": false, 00:31:16.964 "write_zeroes": true, 00:31:16.964 "zcopy": false, 00:31:16.964 "get_zone_info": false, 00:31:16.964 "zone_management": false, 00:31:16.964 "zone_append": false, 00:31:16.964 "compare": false, 00:31:16.964 "compare_and_write": false, 00:31:16.964 "abort": false, 00:31:16.964 "seek_hole": true, 00:31:16.964 "seek_data": true, 00:31:16.964 "copy": false, 00:31:16.964 "nvme_iov_md": false 00:31:16.964 }, 00:31:16.964 "driver_specific": { 00:31:16.964 "lvol": { 00:31:16.964 "lvol_store_uuid": "63ab9d08-9dfc-4b4e-b6e7-53fcd1a9dfa5", 00:31:16.964 "base_bdev": "Nvme0n1", 00:31:16.964 "thin_provision": true, 00:31:16.964 "num_allocated_clusters": 0, 00:31:16.964 "snapshot": false, 00:31:16.964 "clone": false, 00:31:16.964 "esnap_clone": false 00:31:16.964 } 00:31:16.964 } 
00:31:16.964 } 00:31:16.964 ] 00:31:16.964 20:07:08 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:16.964 20:07:08 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:16.964 20:07:08 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:17.291 [2024-07-24 20:07:08.573169] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:17.291 COMP_lvs0/lv0 00:31:17.291 20:07:08 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:17.291 20:07:08 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:17.291 20:07:08 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:17.291 20:07:08 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:17.291 20:07:08 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:17.291 20:07:08 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:17.291 20:07:08 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:17.549 20:07:09 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:17.808 [ 00:31:17.808 { 00:31:17.808 "name": "COMP_lvs0/lv0", 00:31:17.808 "aliases": [ 00:31:17.808 "89fa8b72-7a51-5b04-9c59-554eb9d366f2" 00:31:17.808 ], 00:31:17.808 "product_name": "compress", 00:31:17.808 "block_size": 512, 00:31:17.808 "num_blocks": 200704, 00:31:17.808 "uuid": "89fa8b72-7a51-5b04-9c59-554eb9d366f2", 00:31:17.808 "assigned_rate_limits": { 00:31:17.808 "rw_ios_per_sec": 0, 00:31:17.808 "rw_mbytes_per_sec": 0, 00:31:17.808 "r_mbytes_per_sec": 0, 00:31:17.808 "w_mbytes_per_sec": 0 00:31:17.808 
}, 00:31:17.808 "claimed": false, 00:31:17.808 "zoned": false, 00:31:17.808 "supported_io_types": { 00:31:17.808 "read": true, 00:31:17.808 "write": true, 00:31:17.808 "unmap": false, 00:31:17.808 "flush": false, 00:31:17.808 "reset": false, 00:31:17.808 "nvme_admin": false, 00:31:17.808 "nvme_io": false, 00:31:17.808 "nvme_io_md": false, 00:31:17.808 "write_zeroes": true, 00:31:17.808 "zcopy": false, 00:31:17.808 "get_zone_info": false, 00:31:17.808 "zone_management": false, 00:31:17.808 "zone_append": false, 00:31:17.808 "compare": false, 00:31:17.808 "compare_and_write": false, 00:31:17.808 "abort": false, 00:31:17.808 "seek_hole": false, 00:31:17.808 "seek_data": false, 00:31:17.808 "copy": false, 00:31:17.808 "nvme_iov_md": false 00:31:17.808 }, 00:31:17.808 "driver_specific": { 00:31:17.808 "compress": { 00:31:17.808 "name": "COMP_lvs0/lv0", 00:31:17.808 "base_bdev_name": "b4654daa-c07c-47db-90af-9801f0122125", 00:31:17.808 "pm_path": "/tmp/pmem/ef6f1a2d-22d4-4274-b693-05418de828fe" 00:31:17.808 } 00:31:17.808 } 00:31:17.808 } 00:31:17.808 ] 00:31:17.808 20:07:09 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:17.808 20:07:09 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:18.067 [2024-07-24 20:07:09.593567] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa1d41b1600 PMD being used: compress_qat 00:31:18.067 [2024-07-24 20:07:09.596918] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf57760 PMD being used: compress_qat 00:31:18.067 Running I/O for 3 seconds... 
00:31:21.363 00:31:21.363 Latency(us) 00:31:21.363 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:21.363 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:21.363 Verification LBA range: start 0x0 length 0x3100 00:31:21.363 COMP_lvs0/lv0 : 3.01 1657.75 6.48 0.00 0.00 19201.60 2080.06 19717.79 00:31:21.363 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:21.363 Verification LBA range: start 0x3100 length 0x3100 00:31:21.363 COMP_lvs0/lv0 : 3.01 1756.08 6.86 0.00 0.00 18105.40 1367.71 19033.93 00:31:21.363 =================================================================================================================== 00:31:21.363 Total : 3413.82 13.34 0.00 0.00 18637.93 1367.71 19717.79 00:31:21.363 0 00:31:21.363 20:07:12 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:21.363 20:07:12 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:21.363 20:07:12 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:21.623 20:07:13 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:21.623 20:07:13 compress_compdev -- compress/compress.sh@78 -- # killprocess 1548405 00:31:21.623 20:07:13 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1548405 ']' 00:31:21.623 20:07:13 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1548405 00:31:21.623 20:07:13 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:31:21.623 20:07:13 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:21.623 20:07:13 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1548405 00:31:21.882 20:07:13 compress_compdev -- common/autotest_common.sh@956 -- # 
process_name=reactor_1 00:31:21.882 20:07:13 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:21.882 20:07:13 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1548405' 00:31:21.882 killing process with pid 1548405 00:31:21.882 20:07:13 compress_compdev -- common/autotest_common.sh@969 -- # kill 1548405 00:31:21.882 Received shutdown signal, test time was about 3.000000 seconds 00:31:21.882 00:31:21.882 Latency(us) 00:31:21.882 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:21.882 =================================================================================================================== 00:31:21.882 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:21.882 20:07:13 compress_compdev -- common/autotest_common.sh@974 -- # wait 1548405 00:31:25.169 20:07:16 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:25.169 20:07:16 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:25.169 20:07:16 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1550175 00:31:25.169 20:07:16 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:25.169 20:07:16 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:25.169 20:07:16 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1550175 00:31:25.169 20:07:16 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1550175 ']' 00:31:25.169 20:07:16 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:25.169 20:07:16 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:25.169 20:07:16 compress_compdev -- 
common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:25.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:25.169 20:07:16 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:25.169 20:07:16 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:25.169 [2024-07-24 20:07:16.344707] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:31:25.169 [2024-07-24 20:07:16.344778] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1550175 ] 00:31:25.169 [2024-07-24 20:07:16.480068] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:25.169 [2024-07-24 20:07:16.597851] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:25.169 [2024-07-24 20:07:16.597855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:26.107 [2024-07-24 20:07:17.567198] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:26.107 20:07:17 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:26.107 20:07:17 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:31:26.107 20:07:17 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:31:26.107 20:07:17 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:26.107 20:07:17 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:26.676 [2024-07-24 20:07:18.256444] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1781430 PMD being used: compress_qat 00:31:26.936 20:07:18 
compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:26.936 20:07:18 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:26.936 20:07:18 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:26.936 20:07:18 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:26.936 20:07:18 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:26.936 20:07:18 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:26.936 20:07:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:27.195 20:07:18 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:27.454 [ 00:31:27.454 { 00:31:27.454 "name": "Nvme0n1", 00:31:27.454 "aliases": [ 00:31:27.454 "01000000-0000-0000-5cd2-e43197705251" 00:31:27.454 ], 00:31:27.454 "product_name": "NVMe disk", 00:31:27.454 "block_size": 512, 00:31:27.454 "num_blocks": 15002931888, 00:31:27.454 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:27.454 "assigned_rate_limits": { 00:31:27.454 "rw_ios_per_sec": 0, 00:31:27.454 "rw_mbytes_per_sec": 0, 00:31:27.454 "r_mbytes_per_sec": 0, 00:31:27.454 "w_mbytes_per_sec": 0 00:31:27.454 }, 00:31:27.454 "claimed": false, 00:31:27.454 "zoned": false, 00:31:27.454 "supported_io_types": { 00:31:27.454 "read": true, 00:31:27.454 "write": true, 00:31:27.454 "unmap": true, 00:31:27.454 "flush": true, 00:31:27.454 "reset": true, 00:31:27.454 "nvme_admin": true, 00:31:27.454 "nvme_io": true, 00:31:27.454 "nvme_io_md": false, 00:31:27.454 "write_zeroes": true, 00:31:27.454 "zcopy": false, 00:31:27.454 "get_zone_info": false, 00:31:27.454 "zone_management": false, 00:31:27.454 "zone_append": false, 00:31:27.454 "compare": false, 00:31:27.454 "compare_and_write": false, 00:31:27.454 
"abort": true, 00:31:27.454 "seek_hole": false, 00:31:27.454 "seek_data": false, 00:31:27.454 "copy": false, 00:31:27.454 "nvme_iov_md": false 00:31:27.454 }, 00:31:27.454 "driver_specific": { 00:31:27.454 "nvme": [ 00:31:27.454 { 00:31:27.454 "pci_address": "0000:5e:00.0", 00:31:27.454 "trid": { 00:31:27.454 "trtype": "PCIe", 00:31:27.454 "traddr": "0000:5e:00.0" 00:31:27.454 }, 00:31:27.454 "ctrlr_data": { 00:31:27.454 "cntlid": 0, 00:31:27.454 "vendor_id": "0x8086", 00:31:27.454 "model_number": "INTEL SSDPF2KX076TZO", 00:31:27.454 "serial_number": "PHAC0301002G7P6CGN", 00:31:27.454 "firmware_revision": "JCV10200", 00:31:27.454 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:27.454 "oacs": { 00:31:27.454 "security": 1, 00:31:27.454 "format": 1, 00:31:27.454 "firmware": 1, 00:31:27.454 "ns_manage": 1 00:31:27.454 }, 00:31:27.454 "multi_ctrlr": false, 00:31:27.454 "ana_reporting": false 00:31:27.454 }, 00:31:27.454 "vs": { 00:31:27.454 "nvme_version": "1.3" 00:31:27.454 }, 00:31:27.454 "ns_data": { 00:31:27.454 "id": 1, 00:31:27.454 "can_share": false 00:31:27.454 }, 00:31:27.454 "security": { 00:31:27.454 "opal": true 00:31:27.454 } 00:31:27.454 } 00:31:27.454 ], 00:31:27.454 "mp_policy": "active_passive" 00:31:27.454 } 00:31:27.454 } 00:31:27.454 ] 00:31:27.454 20:07:18 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:27.454 20:07:18 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:27.713 [2024-07-24 20:07:19.054954] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15b8210 PMD being used: compress_qat 00:31:30.249 7f11e3a8-e7a1-4981-8ac3-68f7329e09d2 00:31:30.249 20:07:21 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:30.249 c7bdce7b-4d16-42ca-b43e-b0269ea7f70f 00:31:30.249 20:07:21 
compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:30.249 20:07:21 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:30.249 20:07:21 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:30.249 20:07:21 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:30.249 20:07:21 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:30.249 20:07:21 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:30.249 20:07:21 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:30.249 20:07:21 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:30.508 [ 00:31:30.508 { 00:31:30.508 "name": "c7bdce7b-4d16-42ca-b43e-b0269ea7f70f", 00:31:30.508 "aliases": [ 00:31:30.508 "lvs0/lv0" 00:31:30.508 ], 00:31:30.508 "product_name": "Logical Volume", 00:31:30.508 "block_size": 512, 00:31:30.508 "num_blocks": 204800, 00:31:30.508 "uuid": "c7bdce7b-4d16-42ca-b43e-b0269ea7f70f", 00:31:30.508 "assigned_rate_limits": { 00:31:30.508 "rw_ios_per_sec": 0, 00:31:30.508 "rw_mbytes_per_sec": 0, 00:31:30.508 "r_mbytes_per_sec": 0, 00:31:30.508 "w_mbytes_per_sec": 0 00:31:30.508 }, 00:31:30.508 "claimed": false, 00:31:30.509 "zoned": false, 00:31:30.509 "supported_io_types": { 00:31:30.509 "read": true, 00:31:30.509 "write": true, 00:31:30.509 "unmap": true, 00:31:30.509 "flush": false, 00:31:30.509 "reset": true, 00:31:30.509 "nvme_admin": false, 00:31:30.509 "nvme_io": false, 00:31:30.509 "nvme_io_md": false, 00:31:30.509 "write_zeroes": true, 00:31:30.509 "zcopy": false, 00:31:30.509 "get_zone_info": false, 00:31:30.509 "zone_management": false, 00:31:30.509 "zone_append": false, 00:31:30.509 "compare": false, 00:31:30.509 "compare_and_write": false, 00:31:30.509 
"abort": false, 00:31:30.509 "seek_hole": true, 00:31:30.509 "seek_data": true, 00:31:30.509 "copy": false, 00:31:30.509 "nvme_iov_md": false 00:31:30.509 }, 00:31:30.509 "driver_specific": { 00:31:30.509 "lvol": { 00:31:30.509 "lvol_store_uuid": "7f11e3a8-e7a1-4981-8ac3-68f7329e09d2", 00:31:30.509 "base_bdev": "Nvme0n1", 00:31:30.509 "thin_provision": true, 00:31:30.509 "num_allocated_clusters": 0, 00:31:30.509 "snapshot": false, 00:31:30.509 "clone": false, 00:31:30.509 "esnap_clone": false 00:31:30.509 } 00:31:30.509 } 00:31:30.509 } 00:31:30.509 ] 00:31:30.509 20:07:22 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:30.509 20:07:22 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:30.509 20:07:22 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:30.768 [2024-07-24 20:07:22.296715] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:30.768 COMP_lvs0/lv0 00:31:30.768 20:07:22 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:30.768 20:07:22 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:30.768 20:07:22 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:30.768 20:07:22 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:30.768 20:07:22 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:30.768 20:07:22 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:30.768 20:07:22 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:31.027 20:07:22 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
COMP_lvs0/lv0 -t 2000 00:31:31.286 [ 00:31:31.286 { 00:31:31.286 "name": "COMP_lvs0/lv0", 00:31:31.286 "aliases": [ 00:31:31.286 "ca25b57f-e6a2-5b40-be94-01fad959f37b" 00:31:31.286 ], 00:31:31.286 "product_name": "compress", 00:31:31.286 "block_size": 512, 00:31:31.286 "num_blocks": 200704, 00:31:31.286 "uuid": "ca25b57f-e6a2-5b40-be94-01fad959f37b", 00:31:31.286 "assigned_rate_limits": { 00:31:31.286 "rw_ios_per_sec": 0, 00:31:31.286 "rw_mbytes_per_sec": 0, 00:31:31.286 "r_mbytes_per_sec": 0, 00:31:31.286 "w_mbytes_per_sec": 0 00:31:31.286 }, 00:31:31.286 "claimed": false, 00:31:31.286 "zoned": false, 00:31:31.286 "supported_io_types": { 00:31:31.286 "read": true, 00:31:31.286 "write": true, 00:31:31.286 "unmap": false, 00:31:31.286 "flush": false, 00:31:31.286 "reset": false, 00:31:31.286 "nvme_admin": false, 00:31:31.286 "nvme_io": false, 00:31:31.286 "nvme_io_md": false, 00:31:31.286 "write_zeroes": true, 00:31:31.286 "zcopy": false, 00:31:31.286 "get_zone_info": false, 00:31:31.286 "zone_management": false, 00:31:31.286 "zone_append": false, 00:31:31.286 "compare": false, 00:31:31.286 "compare_and_write": false, 00:31:31.286 "abort": false, 00:31:31.286 "seek_hole": false, 00:31:31.286 "seek_data": false, 00:31:31.286 "copy": false, 00:31:31.286 "nvme_iov_md": false 00:31:31.286 }, 00:31:31.286 "driver_specific": { 00:31:31.286 "compress": { 00:31:31.286 "name": "COMP_lvs0/lv0", 00:31:31.286 "base_bdev_name": "c7bdce7b-4d16-42ca-b43e-b0269ea7f70f", 00:31:31.286 "pm_path": "/tmp/pmem/1d391707-a4cd-4921-9900-3179d79f4f69" 00:31:31.286 } 00:31:31.286 } 00:31:31.286 } 00:31:31.286 ] 00:31:31.286 20:07:22 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:31.286 20:07:22 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:31.546 [2024-07-24 20:07:22.981243] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1661f20 PMD being used: 
compress_qat 00:31:31.546 [2024-07-24 20:07:22.985346] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fdbdc19bc50 PMD being used: compress_qat 00:31:31.546 Running I/O for 3 seconds... 00:31:34.835 00:31:34.835 Latency(us) 00:31:34.835 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:34.835 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:34.835 Verification LBA range: start 0x0 length 0x3100 00:31:34.835 COMP_lvs0/lv0 : 3.01 2771.16 10.82 0.00 0.00 11460.42 869.06 10029.86 00:31:34.835 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:34.835 Verification LBA range: start 0x3100 length 0x3100 00:31:34.835 COMP_lvs0/lv0 : 3.01 2610.66 10.20 0.00 0.00 12122.93 1025.78 10941.66 00:31:34.835 =================================================================================================================== 00:31:34.835 Total : 5381.82 21.02 0.00 0.00 11781.87 869.06 10941.66 00:31:34.835 0 00:31:34.835 20:07:26 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:34.835 20:07:26 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:34.835 20:07:26 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:35.094 20:07:26 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:35.094 20:07:26 compress_compdev -- compress/compress.sh@78 -- # killprocess 1550175 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1550175 ']' 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1550175 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 
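[editor's note] A quick sanity check on the bdevperf summary above: the per-core rows sum to the Total row, and the Total latency is (approximately) the IOPS-weighted mean of the per-core averages. This is a minimal sketch, not part of the test run; all numbers are copied verbatim from the 512-byte-block run logged above.

```python
# Sanity-check the bdevperf summary arithmetic for the 512-byte-block run.
# Row values (IOPS, MiB/s, avg latency in us) are copied from the log above.
rows = [
    (2771.16, 10.82, 11460.42),  # COMP_lvs0/lv0, core mask 0x2
    (2610.66, 10.20, 12122.93),  # COMP_lvs0/lv0, core mask 0x4
]

total_iops = sum(iops for iops, _, _ in rows)
total_mibs = sum(mibs for _, mibs, _ in rows)
# The Total average latency is the IOPS-weighted mean of the per-row averages.
weighted_lat = sum(iops * lat for iops, _, lat in rows) / total_iops

print(round(total_iops, 2))  # reported Total: 5381.82
print(round(total_mibs, 2))  # reported Total: 21.02
# weighted_lat lands within ~0.1 us of the reported 11781.87; the small gap
# comes from rounding in the displayed per-row values.
```

The same relationship holds for the 4096-byte run later in this log (2785.89 + 2610.04 = 5395.93 IOPS).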
00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1550175 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1550175' 00:31:35.094 killing process with pid 1550175 00:31:35.094 20:07:26 compress_compdev -- common/autotest_common.sh@969 -- # kill 1550175 00:31:35.094 Received shutdown signal, test time was about 3.000000 seconds 00:31:35.094 00:31:35.094 Latency(us) 00:31:35.094 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:35.094 =================================================================================================================== 00:31:35.094 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:35.095 20:07:26 compress_compdev -- common/autotest_common.sh@974 -- # wait 1550175 00:31:38.386 20:07:29 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:38.386 20:07:29 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:38.386 20:07:29 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1551941 00:31:38.386 20:07:29 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:38.386 20:07:29 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:38.386 20:07:29 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1551941 00:31:38.386 20:07:29 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1551941 ']' 00:31:38.386 20:07:29 compress_compdev -- common/autotest_common.sh@835 
-- # local rpc_addr=/var/tmp/spdk.sock 00:31:38.386 20:07:29 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:38.386 20:07:29 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:38.386 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:38.386 20:07:29 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:38.386 20:07:29 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:38.386 [2024-07-24 20:07:29.719674] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:31:38.386 [2024-07-24 20:07:29.719747] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551941 ] 00:31:38.386 [2024-07-24 20:07:29.854479] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:38.386 [2024-07-24 20:07:29.971274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:38.386 [2024-07-24 20:07:29.971280] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:39.764 [2024-07-24 20:07:30.925695] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:39.764 20:07:31 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:39.764 20:07:31 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:31:39.764 20:07:31 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:31:39.764 20:07:31 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:39.764 20:07:31 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
load_subsystem_config 00:31:40.023 [2024-07-24 20:07:31.609297] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24c6430 PMD being used: compress_qat 00:31:40.282 20:07:31 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:40.282 20:07:31 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:40.282 20:07:31 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:40.282 20:07:31 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:40.282 20:07:31 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:40.282 20:07:31 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:40.282 20:07:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:40.543 20:07:31 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:40.543 [ 00:31:40.543 { 00:31:40.543 "name": "Nvme0n1", 00:31:40.543 "aliases": [ 00:31:40.543 "01000000-0000-0000-5cd2-e43197705251" 00:31:40.543 ], 00:31:40.543 "product_name": "NVMe disk", 00:31:40.543 "block_size": 512, 00:31:40.543 "num_blocks": 15002931888, 00:31:40.543 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:40.543 "assigned_rate_limits": { 00:31:40.543 "rw_ios_per_sec": 0, 00:31:40.543 "rw_mbytes_per_sec": 0, 00:31:40.543 "r_mbytes_per_sec": 0, 00:31:40.543 "w_mbytes_per_sec": 0 00:31:40.543 }, 00:31:40.543 "claimed": false, 00:31:40.543 "zoned": false, 00:31:40.543 "supported_io_types": { 00:31:40.543 "read": true, 00:31:40.543 "write": true, 00:31:40.543 "unmap": true, 00:31:40.543 "flush": true, 00:31:40.543 "reset": true, 00:31:40.543 "nvme_admin": true, 00:31:40.543 "nvme_io": true, 00:31:40.543 "nvme_io_md": false, 00:31:40.543 "write_zeroes": true, 00:31:40.543 "zcopy": false, 00:31:40.543 
"get_zone_info": false, 00:31:40.543 "zone_management": false, 00:31:40.543 "zone_append": false, 00:31:40.543 "compare": false, 00:31:40.543 "compare_and_write": false, 00:31:40.543 "abort": true, 00:31:40.543 "seek_hole": false, 00:31:40.543 "seek_data": false, 00:31:40.543 "copy": false, 00:31:40.543 "nvme_iov_md": false 00:31:40.543 }, 00:31:40.543 "driver_specific": { 00:31:40.543 "nvme": [ 00:31:40.543 { 00:31:40.543 "pci_address": "0000:5e:00.0", 00:31:40.543 "trid": { 00:31:40.543 "trtype": "PCIe", 00:31:40.544 "traddr": "0000:5e:00.0" 00:31:40.544 }, 00:31:40.544 "ctrlr_data": { 00:31:40.544 "cntlid": 0, 00:31:40.544 "vendor_id": "0x8086", 00:31:40.544 "model_number": "INTEL SSDPF2KX076TZO", 00:31:40.544 "serial_number": "PHAC0301002G7P6CGN", 00:31:40.544 "firmware_revision": "JCV10200", 00:31:40.544 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:40.544 "oacs": { 00:31:40.544 "security": 1, 00:31:40.544 "format": 1, 00:31:40.544 "firmware": 1, 00:31:40.544 "ns_manage": 1 00:31:40.544 }, 00:31:40.544 "multi_ctrlr": false, 00:31:40.544 "ana_reporting": false 00:31:40.544 }, 00:31:40.544 "vs": { 00:31:40.544 "nvme_version": "1.3" 00:31:40.544 }, 00:31:40.544 "ns_data": { 00:31:40.544 "id": 1, 00:31:40.544 "can_share": false 00:31:40.544 }, 00:31:40.544 "security": { 00:31:40.544 "opal": true 00:31:40.544 } 00:31:40.544 } 00:31:40.544 ], 00:31:40.544 "mp_policy": "active_passive" 00:31:40.544 } 00:31:40.544 } 00:31:40.544 ] 00:31:40.871 20:07:32 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:40.871 20:07:32 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:40.871 [2024-07-24 20:07:32.308062] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22fd210 PMD being used: compress_qat 00:31:43.404 fec360df-8d7a-4095-ad04-8ae3943e4257 00:31:43.404 20:07:34 compress_compdev -- compress/compress.sh@38 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:43.404 e309c6cd-5267-437c-bcc9-c858b9b561d9 00:31:43.404 20:07:34 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:43.404 20:07:34 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:43.404 20:07:34 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:43.404 20:07:34 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:43.404 20:07:34 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:43.404 20:07:34 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:43.404 20:07:34 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:43.404 20:07:34 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:43.664 [ 00:31:43.664 { 00:31:43.664 "name": "e309c6cd-5267-437c-bcc9-c858b9b561d9", 00:31:43.664 "aliases": [ 00:31:43.664 "lvs0/lv0" 00:31:43.664 ], 00:31:43.664 "product_name": "Logical Volume", 00:31:43.664 "block_size": 512, 00:31:43.664 "num_blocks": 204800, 00:31:43.664 "uuid": "e309c6cd-5267-437c-bcc9-c858b9b561d9", 00:31:43.664 "assigned_rate_limits": { 00:31:43.664 "rw_ios_per_sec": 0, 00:31:43.664 "rw_mbytes_per_sec": 0, 00:31:43.664 "r_mbytes_per_sec": 0, 00:31:43.664 "w_mbytes_per_sec": 0 00:31:43.664 }, 00:31:43.664 "claimed": false, 00:31:43.664 "zoned": false, 00:31:43.664 "supported_io_types": { 00:31:43.664 "read": true, 00:31:43.664 "write": true, 00:31:43.664 "unmap": true, 00:31:43.664 "flush": false, 00:31:43.664 "reset": true, 00:31:43.664 "nvme_admin": false, 00:31:43.664 "nvme_io": false, 00:31:43.664 "nvme_io_md": false, 00:31:43.664 "write_zeroes": true, 00:31:43.664 "zcopy": false, 00:31:43.664 
"get_zone_info": false, 00:31:43.664 "zone_management": false, 00:31:43.664 "zone_append": false, 00:31:43.664 "compare": false, 00:31:43.664 "compare_and_write": false, 00:31:43.664 "abort": false, 00:31:43.664 "seek_hole": true, 00:31:43.664 "seek_data": true, 00:31:43.664 "copy": false, 00:31:43.664 "nvme_iov_md": false 00:31:43.664 }, 00:31:43.664 "driver_specific": { 00:31:43.664 "lvol": { 00:31:43.664 "lvol_store_uuid": "fec360df-8d7a-4095-ad04-8ae3943e4257", 00:31:43.664 "base_bdev": "Nvme0n1", 00:31:43.664 "thin_provision": true, 00:31:43.664 "num_allocated_clusters": 0, 00:31:43.664 "snapshot": false, 00:31:43.664 "clone": false, 00:31:43.664 "esnap_clone": false 00:31:43.664 } 00:31:43.664 } 00:31:43.664 } 00:31:43.664 ] 00:31:43.664 20:07:35 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:43.664 20:07:35 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:43.664 20:07:35 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:43.923 [2024-07-24 20:07:35.440851] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:43.923 COMP_lvs0/lv0 00:31:43.923 20:07:35 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:43.923 20:07:35 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:43.923 20:07:35 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:43.923 20:07:35 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:43.923 20:07:35 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:43.923 20:07:35 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:43.923 20:07:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:31:44.183 20:07:35 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:44.442 [ 00:31:44.442 { 00:31:44.442 "name": "COMP_lvs0/lv0", 00:31:44.442 "aliases": [ 00:31:44.442 "950381ba-eac7-59ce-baab-ba62e5404b5d" 00:31:44.442 ], 00:31:44.442 "product_name": "compress", 00:31:44.442 "block_size": 4096, 00:31:44.442 "num_blocks": 25088, 00:31:44.442 "uuid": "950381ba-eac7-59ce-baab-ba62e5404b5d", 00:31:44.442 "assigned_rate_limits": { 00:31:44.442 "rw_ios_per_sec": 0, 00:31:44.442 "rw_mbytes_per_sec": 0, 00:31:44.442 "r_mbytes_per_sec": 0, 00:31:44.442 "w_mbytes_per_sec": 0 00:31:44.442 }, 00:31:44.442 "claimed": false, 00:31:44.442 "zoned": false, 00:31:44.442 "supported_io_types": { 00:31:44.442 "read": true, 00:31:44.442 "write": true, 00:31:44.442 "unmap": false, 00:31:44.442 "flush": false, 00:31:44.442 "reset": false, 00:31:44.442 "nvme_admin": false, 00:31:44.442 "nvme_io": false, 00:31:44.442 "nvme_io_md": false, 00:31:44.442 "write_zeroes": true, 00:31:44.442 "zcopy": false, 00:31:44.442 "get_zone_info": false, 00:31:44.442 "zone_management": false, 00:31:44.442 "zone_append": false, 00:31:44.442 "compare": false, 00:31:44.442 "compare_and_write": false, 00:31:44.442 "abort": false, 00:31:44.442 "seek_hole": false, 00:31:44.442 "seek_data": false, 00:31:44.442 "copy": false, 00:31:44.442 "nvme_iov_md": false 00:31:44.442 }, 00:31:44.442 "driver_specific": { 00:31:44.442 "compress": { 00:31:44.442 "name": "COMP_lvs0/lv0", 00:31:44.442 "base_bdev_name": "e309c6cd-5267-437c-bcc9-c858b9b561d9", 00:31:44.442 "pm_path": "/tmp/pmem/369c64cf-2651-4c1e-b694-80813b7095b1" 00:31:44.442 } 00:31:44.442 } 00:31:44.442 } 00:31:44.442 ] 00:31:44.442 20:07:35 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:44.442 20:07:35 compress_compdev -- compress/compress.sh@75 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:44.701 [2024-07-24 20:07:36.109776] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23a6f20 PMD being used: compress_qat 00:31:44.701 [2024-07-24 20:07:36.113957] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f11a819bc50 PMD being used: compress_qat 00:31:44.701 Running I/O for 3 seconds... 00:31:47.986 00:31:47.986 Latency(us) 00:31:47.986 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:47.986 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:47.986 Verification LBA range: start 0x0 length 0x3100 00:31:47.986 COMP_lvs0/lv0 : 3.01 2785.89 10.88 0.00 0.00 11401.35 701.66 10029.86 00:31:47.986 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:47.986 Verification LBA range: start 0x3100 length 0x3100 00:31:47.986 COMP_lvs0/lv0 : 3.01 2610.04 10.20 0.00 0.00 12127.13 1161.13 10941.66 00:31:47.986 =================================================================================================================== 00:31:47.986 Total : 5395.93 21.08 0.00 0.00 11752.50 701.66 10941.66 00:31:47.986 0 00:31:47.986 20:07:39 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:47.986 20:07:39 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:47.986 20:07:39 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:48.245 20:07:39 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:48.245 20:07:39 compress_compdev -- compress/compress.sh@78 -- # killprocess 1551941 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1551941 ']' 00:31:48.245 20:07:39 compress_compdev -- 
common/autotest_common.sh@954 -- # kill -0 1551941 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1551941 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1551941' 00:31:48.245 killing process with pid 1551941 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@969 -- # kill 1551941 00:31:48.245 Received shutdown signal, test time was about 3.000000 seconds 00:31:48.245 00:31:48.245 Latency(us) 00:31:48.245 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:48.245 =================================================================================================================== 00:31:48.245 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:48.245 20:07:39 compress_compdev -- common/autotest_common.sh@974 -- # wait 1551941 00:31:51.534 20:07:42 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:31:51.534 20:07:42 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:51.534 20:07:42 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1553545 00:31:51.534 20:07:42 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:51.534 20:07:42 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:31:51.534 20:07:42 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 
1553545 00:31:51.534 20:07:42 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1553545 ']' 00:31:51.534 20:07:42 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:51.534 20:07:42 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:51.534 20:07:42 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:51.534 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:51.534 20:07:42 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:51.534 20:07:42 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:51.534 [2024-07-24 20:07:42.864847] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:31:51.534 [2024-07-24 20:07:42.864929] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1553545 ] 00:31:51.534 [2024-07-24 20:07:42.997221] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:51.534 [2024-07-24 20:07:43.103845] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:51.534 [2024-07-24 20:07:43.103947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:51.534 [2024-07-24 20:07:43.103948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:52.469 [2024-07-24 20:07:43.863995] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:52.469 20:07:43 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:52.469 20:07:43 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:31:52.469 20:07:43 compress_compdev -- compress/compress.sh@58 -- # create_vols 
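[editor's note] One detail worth flagging while reading the `bdev_get_bdevs` dumps in this log: the same 100 MiB lvol (204800 blocks of 512 B) is exported as a compress bdev twice with different logical block sizes, `-l 512` (200704 blocks of 512 B) and `-l 4096` (25088 blocks of 4096 B). A minimal check, using only block counts copied from the dumps above, confirms both describe the same 98 MiB of usable capacity:

```python
# Capacity check using block counts from the get_bdevs output in this log.
lvol_bytes = 204800 * 512   # "num_blocks": 204800, "block_size": 512 (lvs0/lv0)
comp_512   = 200704 * 512   # compress bdev created with -l 512
comp_4096  = 25088 * 4096   # compress bdev created with -l 4096

assert comp_512 == comp_4096           # same usable size either way
print(comp_512 // (1024 * 1024))       # 98  (bdevio later prints "98 MiB" too)
print(lvol_bytes // (1024 * 1024))     # 100 (the backing lvol)
```

The 2 MiB difference between the lvol and the compress bdev is capacity the compress vbdev does not expose to I/O.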
00:31:52.469 20:07:43 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:52.469 20:07:43 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:53.037 [2024-07-24 20:07:44.448446] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2245d00 PMD being used: compress_qat 00:31:53.037 20:07:44 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:53.037 20:07:44 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:31:53.037 20:07:44 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:53.037 20:07:44 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:53.037 20:07:44 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:53.037 20:07:44 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:53.037 20:07:44 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:53.296 20:07:44 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:53.554 [ 00:31:53.554 { 00:31:53.554 "name": "Nvme0n1", 00:31:53.554 "aliases": [ 00:31:53.554 "01000000-0000-0000-5cd2-e43197705251" 00:31:53.554 ], 00:31:53.554 "product_name": "NVMe disk", 00:31:53.554 "block_size": 512, 00:31:53.554 "num_blocks": 15002931888, 00:31:53.554 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:53.554 "assigned_rate_limits": { 00:31:53.554 "rw_ios_per_sec": 0, 00:31:53.554 "rw_mbytes_per_sec": 0, 00:31:53.554 "r_mbytes_per_sec": 0, 00:31:53.554 "w_mbytes_per_sec": 0 00:31:53.554 }, 00:31:53.554 "claimed": false, 00:31:53.554 "zoned": false, 00:31:53.554 "supported_io_types": { 00:31:53.554 "read": true, 00:31:53.554 
"write": true, 00:31:53.554 "unmap": true, 00:31:53.554 "flush": true, 00:31:53.554 "reset": true, 00:31:53.554 "nvme_admin": true, 00:31:53.554 "nvme_io": true, 00:31:53.554 "nvme_io_md": false, 00:31:53.554 "write_zeroes": true, 00:31:53.554 "zcopy": false, 00:31:53.554 "get_zone_info": false, 00:31:53.554 "zone_management": false, 00:31:53.554 "zone_append": false, 00:31:53.554 "compare": false, 00:31:53.554 "compare_and_write": false, 00:31:53.554 "abort": true, 00:31:53.554 "seek_hole": false, 00:31:53.554 "seek_data": false, 00:31:53.554 "copy": false, 00:31:53.554 "nvme_iov_md": false 00:31:53.554 }, 00:31:53.554 "driver_specific": { 00:31:53.554 "nvme": [ 00:31:53.554 { 00:31:53.554 "pci_address": "0000:5e:00.0", 00:31:53.554 "trid": { 00:31:53.554 "trtype": "PCIe", 00:31:53.554 "traddr": "0000:5e:00.0" 00:31:53.554 }, 00:31:53.554 "ctrlr_data": { 00:31:53.554 "cntlid": 0, 00:31:53.554 "vendor_id": "0x8086", 00:31:53.554 "model_number": "INTEL SSDPF2KX076TZO", 00:31:53.554 "serial_number": "PHAC0301002G7P6CGN", 00:31:53.554 "firmware_revision": "JCV10200", 00:31:53.555 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:53.555 "oacs": { 00:31:53.555 "security": 1, 00:31:53.555 "format": 1, 00:31:53.555 "firmware": 1, 00:31:53.555 "ns_manage": 1 00:31:53.555 }, 00:31:53.555 "multi_ctrlr": false, 00:31:53.555 "ana_reporting": false 00:31:53.555 }, 00:31:53.555 "vs": { 00:31:53.555 "nvme_version": "1.3" 00:31:53.555 }, 00:31:53.555 "ns_data": { 00:31:53.555 "id": 1, 00:31:53.555 "can_share": false 00:31:53.555 }, 00:31:53.555 "security": { 00:31:53.555 "opal": false 00:31:53.555 } 00:31:53.555 } 00:31:53.555 ], 00:31:53.555 "mp_policy": "active_passive" 00:31:53.555 } 00:31:53.555 } 00:31:53.555 ] 00:31:53.555 20:07:44 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:53.555 20:07:44 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore 
--clear-method none Nvme0n1 lvs0 00:31:53.555 [2024-07-24 20:07:45.105882] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x207cfe0 PMD being used: compress_qat 00:31:56.090 18695a66-dc50-4c87-882a-b24534d06c8a 00:31:56.090 20:07:47 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:56.090 83ecbdbd-f95f-46da-8b25-812203322461 00:31:56.090 20:07:47 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:56.090 20:07:47 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:31:56.090 20:07:47 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:56.090 20:07:47 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:56.090 20:07:47 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:56.090 20:07:47 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:56.090 20:07:47 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:56.349 20:07:47 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:56.607 [ 00:31:56.607 { 00:31:56.607 "name": "83ecbdbd-f95f-46da-8b25-812203322461", 00:31:56.607 "aliases": [ 00:31:56.607 "lvs0/lv0" 00:31:56.607 ], 00:31:56.607 "product_name": "Logical Volume", 00:31:56.607 "block_size": 512, 00:31:56.607 "num_blocks": 204800, 00:31:56.607 "uuid": "83ecbdbd-f95f-46da-8b25-812203322461", 00:31:56.607 "assigned_rate_limits": { 00:31:56.607 "rw_ios_per_sec": 0, 00:31:56.607 "rw_mbytes_per_sec": 0, 00:31:56.607 "r_mbytes_per_sec": 0, 00:31:56.607 "w_mbytes_per_sec": 0 00:31:56.607 }, 00:31:56.607 "claimed": false, 00:31:56.607 "zoned": false, 00:31:56.607 "supported_io_types": { 00:31:56.607 "read": true, 
00:31:56.607 "write": true, 00:31:56.607 "unmap": true, 00:31:56.607 "flush": false, 00:31:56.607 "reset": true, 00:31:56.607 "nvme_admin": false, 00:31:56.607 "nvme_io": false, 00:31:56.607 "nvme_io_md": false, 00:31:56.607 "write_zeroes": true, 00:31:56.607 "zcopy": false, 00:31:56.607 "get_zone_info": false, 00:31:56.607 "zone_management": false, 00:31:56.607 "zone_append": false, 00:31:56.607 "compare": false, 00:31:56.607 "compare_and_write": false, 00:31:56.607 "abort": false, 00:31:56.607 "seek_hole": true, 00:31:56.607 "seek_data": true, 00:31:56.607 "copy": false, 00:31:56.607 "nvme_iov_md": false 00:31:56.607 }, 00:31:56.607 "driver_specific": { 00:31:56.607 "lvol": { 00:31:56.607 "lvol_store_uuid": "18695a66-dc50-4c87-882a-b24534d06c8a", 00:31:56.608 "base_bdev": "Nvme0n1", 00:31:56.608 "thin_provision": true, 00:31:56.608 "num_allocated_clusters": 0, 00:31:56.608 "snapshot": false, 00:31:56.608 "clone": false, 00:31:56.608 "esnap_clone": false 00:31:56.608 } 00:31:56.608 } 00:31:56.608 } 00:31:56.608 ] 00:31:56.608 20:07:48 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:56.608 20:07:48 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:56.608 20:07:48 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:56.867 [2024-07-24 20:07:48.354134] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:56.867 COMP_lvs0/lv0 00:31:56.867 20:07:48 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:56.867 20:07:48 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:31:56.867 20:07:48 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:31:56.867 20:07:48 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:31:56.867 20:07:48 compress_compdev -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:31:56.867 20:07:48 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:31:56.867 20:07:48 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:57.125 20:07:48 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:57.384 [ 00:31:57.384 { 00:31:57.384 "name": "COMP_lvs0/lv0", 00:31:57.384 "aliases": [ 00:31:57.384 "052688cc-4ae2-5d74-885b-5ccb0800254d" 00:31:57.384 ], 00:31:57.384 "product_name": "compress", 00:31:57.384 "block_size": 512, 00:31:57.384 "num_blocks": 200704, 00:31:57.384 "uuid": "052688cc-4ae2-5d74-885b-5ccb0800254d", 00:31:57.384 "assigned_rate_limits": { 00:31:57.384 "rw_ios_per_sec": 0, 00:31:57.384 "rw_mbytes_per_sec": 0, 00:31:57.384 "r_mbytes_per_sec": 0, 00:31:57.384 "w_mbytes_per_sec": 0 00:31:57.384 }, 00:31:57.384 "claimed": false, 00:31:57.384 "zoned": false, 00:31:57.384 "supported_io_types": { 00:31:57.384 "read": true, 00:31:57.384 "write": true, 00:31:57.384 "unmap": false, 00:31:57.384 "flush": false, 00:31:57.384 "reset": false, 00:31:57.384 "nvme_admin": false, 00:31:57.384 "nvme_io": false, 00:31:57.385 "nvme_io_md": false, 00:31:57.385 "write_zeroes": true, 00:31:57.385 "zcopy": false, 00:31:57.385 "get_zone_info": false, 00:31:57.385 "zone_management": false, 00:31:57.385 "zone_append": false, 00:31:57.385 "compare": false, 00:31:57.385 "compare_and_write": false, 00:31:57.385 "abort": false, 00:31:57.385 "seek_hole": false, 00:31:57.385 "seek_data": false, 00:31:57.385 "copy": false, 00:31:57.385 "nvme_iov_md": false 00:31:57.385 }, 00:31:57.385 "driver_specific": { 00:31:57.385 "compress": { 00:31:57.385 "name": "COMP_lvs0/lv0", 00:31:57.385 "base_bdev_name": "83ecbdbd-f95f-46da-8b25-812203322461", 00:31:57.385 "pm_path": 
"/tmp/pmem/6453b7d1-c5e9-4507-a57b-0a63be157fd9" 00:31:57.385 } 00:31:57.385 } 00:31:57.385 } 00:31:57.385 ] 00:31:57.385 20:07:48 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:31:57.385 20:07:48 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:57.643 [2024-07-24 20:07:49.027433] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f93f41b1390 PMD being used: compress_qat 00:31:57.643 I/O targets: 00:31:57.643 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:57.643 00:31:57.643 00:31:57.643 CUnit - A unit testing framework for C - Version 2.1-3 00:31:57.643 http://cunit.sourceforge.net/ 00:31:57.643 00:31:57.643 00:31:57.643 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:57.643 Test: blockdev write read block ...passed 00:31:57.643 Test: blockdev write zeroes read block ...passed 00:31:57.643 Test: blockdev write zeroes read no split ...passed 00:31:57.643 Test: blockdev write zeroes read split ...passed 00:31:57.643 Test: blockdev write zeroes read split partial ...passed 00:31:57.643 Test: blockdev reset ...[2024-07-24 20:07:49.131541] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:57.643 passed 00:31:57.643 Test: blockdev write read 8 blocks ...passed 00:31:57.643 Test: blockdev write read size > 128k ...passed 00:31:57.643 Test: blockdev write read invalid size ...passed 00:31:57.643 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:57.643 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:57.643 Test: blockdev write read max offset ...passed 00:31:57.643 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:57.643 Test: blockdev writev readv 8 blocks ...passed 00:31:57.643 Test: blockdev writev readv 30 x 1block ...passed 00:31:57.643 Test: blockdev writev readv block ...passed 00:31:57.643 Test: blockdev writev 
readv size > 128k ...passed 00:31:57.643 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:57.643 Test: blockdev comparev and writev ...passed 00:31:57.643 Test: blockdev nvme passthru rw ...passed 00:31:57.643 Test: blockdev nvme passthru vendor specific ...passed 00:31:57.643 Test: blockdev nvme admin passthru ...passed 00:31:57.643 Test: blockdev copy ...passed 00:31:57.643 00:31:57.643 Run Summary: Type Total Ran Passed Failed Inactive 00:31:57.643 suites 1 1 n/a 0 0 00:31:57.643 tests 23 23 23 0 0 00:31:57.643 asserts 130 130 130 0 n/a 00:31:57.643 00:31:57.643 Elapsed time = 0.236 seconds 00:31:57.643 0 00:31:57.643 20:07:49 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:31:57.643 20:07:49 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:57.902 20:07:49 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:58.160 20:07:49 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:58.160 20:07:49 compress_compdev -- compress/compress.sh@62 -- # killprocess 1553545 00:31:58.160 20:07:49 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1553545 ']' 00:31:58.160 20:07:49 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1553545 00:31:58.160 20:07:49 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:31:58.161 20:07:49 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:58.161 20:07:49 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1553545 00:31:58.161 20:07:49 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:58.161 20:07:49 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:58.161 20:07:49 compress_compdev -- common/autotest_common.sh@968 
-- # echo 'killing process with pid 1553545' 00:31:58.161 killing process with pid 1553545 00:31:58.161 20:07:49 compress_compdev -- common/autotest_common.sh@969 -- # kill 1553545 00:31:58.161 20:07:49 compress_compdev -- common/autotest_common.sh@974 -- # wait 1553545 00:32:01.449 20:07:52 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:32:01.449 20:07:52 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:32:01.449 00:32:01.449 real 0m50.480s 00:32:01.449 user 1m55.705s 00:32:01.449 sys 0m6.517s 00:32:01.449 20:07:52 compress_compdev -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:01.449 20:07:52 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:01.449 ************************************ 00:32:01.449 END TEST compress_compdev 00:32:01.449 ************************************ 00:32:01.449 20:07:52 -- spdk/autotest.sh@353 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:32:01.449 20:07:52 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:32:01.449 20:07:52 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:01.449 20:07:52 -- common/autotest_common.sh@10 -- # set +x 00:32:01.449 ************************************ 00:32:01.449 START TEST compress_isal 00:32:01.449 ************************************ 00:32:01.449 20:07:52 compress_isal -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:32:01.449 * Looking for test storage... 
00:32:01.449 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:32:01.449 20:07:52 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:32:01.449 20:07:52 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:32:01.449 20:07:52 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:32:01.449 20:07:52 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:01.449 20:07:52 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:01.449 20:07:52 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:01.449 20:07:52 compress_isal -- paths/export.sh@5 -- # export PATH 00:32:01.449 20:07:52 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@47 -- # : 0 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:32:01.449 20:07:52 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1554893 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:01.449 20:07:52 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 1554893 00:32:01.449 20:07:52 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1554893 ']' 00:32:01.449 20:07:52 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:01.449 20:07:52 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:01.449 20:07:52 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:01.449 20:07:52 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:01.449 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:01.449 20:07:52 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:01.449 20:07:52 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:01.708 [2024-07-24 20:07:53.048499] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:32:01.708 [2024-07-24 20:07:53.048575] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1554893 ] 00:32:01.708 [2024-07-24 20:07:53.185511] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:01.967 [2024-07-24 20:07:53.308433] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:01.967 [2024-07-24 20:07:53.308440] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:02.534 20:07:54 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:02.534 20:07:54 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:32:02.534 20:07:54 compress_isal -- compress/compress.sh@74 -- # create_vols 00:32:02.534 20:07:54 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:02.534 20:07:54 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:03.101 20:07:54 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:03.101 20:07:54 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:32:03.101 20:07:54 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:03.101 20:07:54 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:03.101 20:07:54 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:03.101 20:07:54 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:03.101 20:07:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:03.360 20:07:54 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:03.662 [ 00:32:03.662 { 00:32:03.662 "name": "Nvme0n1", 00:32:03.662 "aliases": [ 00:32:03.662 "01000000-0000-0000-5cd2-e43197705251" 00:32:03.662 ], 00:32:03.662 "product_name": "NVMe disk", 00:32:03.662 "block_size": 512, 00:32:03.662 "num_blocks": 15002931888, 00:32:03.662 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:03.662 "assigned_rate_limits": { 00:32:03.662 "rw_ios_per_sec": 0, 00:32:03.662 "rw_mbytes_per_sec": 0, 00:32:03.662 "r_mbytes_per_sec": 0, 00:32:03.662 "w_mbytes_per_sec": 0 00:32:03.662 }, 00:32:03.662 "claimed": false, 00:32:03.662 "zoned": false, 00:32:03.662 "supported_io_types": { 00:32:03.662 "read": true, 00:32:03.662 "write": true, 00:32:03.662 "unmap": true, 00:32:03.662 "flush": true, 00:32:03.662 "reset": true, 00:32:03.662 "nvme_admin": true, 00:32:03.662 "nvme_io": true, 00:32:03.662 "nvme_io_md": false, 00:32:03.662 "write_zeroes": true, 00:32:03.662 "zcopy": false, 00:32:03.662 "get_zone_info": false, 00:32:03.662 "zone_management": false, 00:32:03.662 "zone_append": false, 00:32:03.662 "compare": false, 00:32:03.662 "compare_and_write": false, 00:32:03.662 "abort": true, 00:32:03.662 "seek_hole": false, 00:32:03.662 "seek_data": false, 00:32:03.662 "copy": false, 00:32:03.662 "nvme_iov_md": false 00:32:03.662 }, 00:32:03.662 "driver_specific": { 00:32:03.662 "nvme": [ 00:32:03.662 { 00:32:03.662 "pci_address": "0000:5e:00.0", 00:32:03.662 "trid": { 00:32:03.662 "trtype": "PCIe", 00:32:03.662 "traddr": "0000:5e:00.0" 00:32:03.662 }, 00:32:03.662 "ctrlr_data": { 00:32:03.662 "cntlid": 0, 00:32:03.662 "vendor_id": "0x8086", 00:32:03.662 "model_number": "INTEL SSDPF2KX076TZO", 00:32:03.662 "serial_number": "PHAC0301002G7P6CGN", 00:32:03.662 "firmware_revision": "JCV10200", 00:32:03.662 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:03.662 "oacs": { 00:32:03.662 "security": 1, 00:32:03.662 "format": 1, 00:32:03.662 "firmware": 1, 00:32:03.662 "ns_manage": 1 00:32:03.662 }, 
00:32:03.662 "multi_ctrlr": false, 00:32:03.662 "ana_reporting": false 00:32:03.662 }, 00:32:03.662 "vs": { 00:32:03.662 "nvme_version": "1.3" 00:32:03.662 }, 00:32:03.662 "ns_data": { 00:32:03.662 "id": 1, 00:32:03.662 "can_share": false 00:32:03.662 }, 00:32:03.662 "security": { 00:32:03.662 "opal": true 00:32:03.662 } 00:32:03.662 } 00:32:03.662 ], 00:32:03.662 "mp_policy": "active_passive" 00:32:03.662 } 00:32:03.662 } 00:32:03.662 ] 00:32:03.662 20:07:55 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:03.662 20:07:55 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:06.227 8f8ff449-fe90-4f02-9a82-0abb80b46d7f 00:32:06.227 20:07:57 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:06.227 3174a190-e284-4c1d-a171-9e8c6aeeaec1 00:32:06.485 20:07:57 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:06.485 20:07:57 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:32:06.485 20:07:57 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:06.485 20:07:57 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:06.485 20:07:57 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:06.485 20:07:57 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:06.485 20:07:57 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:06.744 20:07:58 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:06.744 [ 00:32:06.744 { 00:32:06.744 "name": "3174a190-e284-4c1d-a171-9e8c6aeeaec1", 00:32:06.744 "aliases": [ 00:32:06.744 "lvs0/lv0" 
00:32:06.744 ], 00:32:06.744 "product_name": "Logical Volume", 00:32:06.744 "block_size": 512, 00:32:06.744 "num_blocks": 204800, 00:32:06.744 "uuid": "3174a190-e284-4c1d-a171-9e8c6aeeaec1", 00:32:06.744 "assigned_rate_limits": { 00:32:06.744 "rw_ios_per_sec": 0, 00:32:06.744 "rw_mbytes_per_sec": 0, 00:32:06.744 "r_mbytes_per_sec": 0, 00:32:06.744 "w_mbytes_per_sec": 0 00:32:06.744 }, 00:32:06.744 "claimed": false, 00:32:06.744 "zoned": false, 00:32:06.744 "supported_io_types": { 00:32:06.744 "read": true, 00:32:06.744 "write": true, 00:32:06.744 "unmap": true, 00:32:06.744 "flush": false, 00:32:06.744 "reset": true, 00:32:06.744 "nvme_admin": false, 00:32:06.744 "nvme_io": false, 00:32:06.744 "nvme_io_md": false, 00:32:06.744 "write_zeroes": true, 00:32:06.744 "zcopy": false, 00:32:06.744 "get_zone_info": false, 00:32:06.744 "zone_management": false, 00:32:06.744 "zone_append": false, 00:32:06.744 "compare": false, 00:32:06.744 "compare_and_write": false, 00:32:06.744 "abort": false, 00:32:06.744 "seek_hole": true, 00:32:06.744 "seek_data": true, 00:32:06.744 "copy": false, 00:32:06.744 "nvme_iov_md": false 00:32:06.744 }, 00:32:06.744 "driver_specific": { 00:32:06.744 "lvol": { 00:32:06.744 "lvol_store_uuid": "8f8ff449-fe90-4f02-9a82-0abb80b46d7f", 00:32:06.744 "base_bdev": "Nvme0n1", 00:32:06.744 "thin_provision": true, 00:32:06.744 "num_allocated_clusters": 0, 00:32:06.744 "snapshot": false, 00:32:06.744 "clone": false, 00:32:06.744 "esnap_clone": false 00:32:06.744 } 00:32:06.744 } 00:32:06.744 } 00:32:06.744 ] 00:32:07.002 20:07:58 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:07.002 20:07:58 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:07.002 20:07:58 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:07.261 [2024-07-24 20:07:58.844972] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:32:07.261 COMP_lvs0/lv0 00:32:07.520 20:07:58 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:07.520 20:07:58 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:32:07.520 20:07:58 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:07.520 20:07:58 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:07.520 20:07:58 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:07.520 20:07:58 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:07.520 20:07:58 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:07.779 20:07:59 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:07.779 [ 00:32:07.779 { 00:32:07.779 "name": "COMP_lvs0/lv0", 00:32:07.779 "aliases": [ 00:32:07.779 "9ad68490-3892-5dc5-b132-12feffa9a51e" 00:32:07.779 ], 00:32:07.779 "product_name": "compress", 00:32:07.779 "block_size": 512, 00:32:07.779 "num_blocks": 200704, 00:32:07.779 "uuid": "9ad68490-3892-5dc5-b132-12feffa9a51e", 00:32:07.779 "assigned_rate_limits": { 00:32:07.779 "rw_ios_per_sec": 0, 00:32:07.779 "rw_mbytes_per_sec": 0, 00:32:07.779 "r_mbytes_per_sec": 0, 00:32:07.779 "w_mbytes_per_sec": 0 00:32:07.779 }, 00:32:07.779 "claimed": false, 00:32:07.779 "zoned": false, 00:32:07.779 "supported_io_types": { 00:32:07.779 "read": true, 00:32:07.779 "write": true, 00:32:07.779 "unmap": false, 00:32:07.779 "flush": false, 00:32:07.779 "reset": false, 00:32:07.779 "nvme_admin": false, 00:32:07.779 "nvme_io": false, 00:32:07.779 "nvme_io_md": false, 00:32:07.779 "write_zeroes": true, 00:32:07.779 "zcopy": false, 00:32:07.779 "get_zone_info": false, 00:32:07.779 "zone_management": false, 00:32:07.779 "zone_append": 
false, 00:32:07.779 "compare": false, 00:32:07.779 "compare_and_write": false, 00:32:07.779 "abort": false, 00:32:07.779 "seek_hole": false, 00:32:07.779 "seek_data": false, 00:32:07.779 "copy": false, 00:32:07.779 "nvme_iov_md": false 00:32:07.779 }, 00:32:07.779 "driver_specific": { 00:32:07.779 "compress": { 00:32:07.779 "name": "COMP_lvs0/lv0", 00:32:07.779 "base_bdev_name": "3174a190-e284-4c1d-a171-9e8c6aeeaec1", 00:32:07.779 "pm_path": "/tmp/pmem/4b059b3e-f162-4610-bb5b-a75a6aeb41bb" 00:32:07.779 } 00:32:07.779 } 00:32:07.779 } 00:32:07.779 ] 00:32:07.779 20:07:59 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:07.779 20:07:59 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:08.037 Running I/O for 3 seconds... 00:32:11.323 00:32:11.323 Latency(us) 00:32:11.323 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:11.323 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:11.323 Verification LBA range: start 0x0 length 0x3100 00:32:11.323 COMP_lvs0/lv0 : 3.01 1273.55 4.97 0.00 0.00 25015.05 2478.97 21769.35 00:32:11.323 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:11.323 Verification LBA range: start 0x3100 length 0x3100 00:32:11.323 COMP_lvs0/lv0 : 3.01 1275.84 4.98 0.00 0.00 24943.85 1531.55 21085.50 00:32:11.323 =================================================================================================================== 00:32:11.323 Total : 2549.40 9.96 0.00 0.00 24979.42 1531.55 21769.35 00:32:11.323 0 00:32:11.323 20:08:02 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:11.323 20:08:02 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:11.323 20:08:02 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:11.582 20:08:03 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:11.582 20:08:03 compress_isal -- compress/compress.sh@78 -- # killprocess 1554893 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1554893 ']' 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1554893 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@955 -- # uname 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1554893 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1554893' 00:32:11.582 killing process with pid 1554893 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@969 -- # kill 1554893 00:32:11.582 Received shutdown signal, test time was about 3.000000 seconds 00:32:11.582 00:32:11.582 Latency(us) 00:32:11.582 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:11.582 =================================================================================================================== 00:32:11.582 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:11.582 20:08:03 compress_isal -- common/autotest_common.sh@974 -- # wait 1554893 00:32:14.867 20:08:06 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:32:14.867 20:08:06 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:14.867 20:08:06 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1556627 00:32:14.867 20:08:06 compress_isal -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:14.867 20:08:06 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1556627 00:32:14.867 20:08:06 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:14.867 20:08:06 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1556627 ']' 00:32:14.867 20:08:06 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:14.867 20:08:06 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:14.867 20:08:06 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:14.867 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:14.867 20:08:06 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:14.867 20:08:06 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:14.867 [2024-07-24 20:08:06.248884] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:32:14.867 [2024-07-24 20:08:06.249026] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556627 ] 00:32:14.867 [2024-07-24 20:08:06.450120] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:15.126 [2024-07-24 20:08:06.581947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:15.126 [2024-07-24 20:08:06.581953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:15.693 20:08:07 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:15.693 20:08:07 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:32:15.693 20:08:07 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:32:15.693 20:08:07 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:15.693 20:08:07 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:16.260 20:08:07 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:16.260 20:08:07 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:32:16.260 20:08:07 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:16.260 20:08:07 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:16.260 20:08:07 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:16.260 20:08:07 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:16.260 20:08:07 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:16.519 20:08:07 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:16.778 [ 00:32:16.778 { 00:32:16.778 "name": "Nvme0n1", 00:32:16.778 "aliases": [ 00:32:16.778 "01000000-0000-0000-5cd2-e43197705251" 00:32:16.778 ], 00:32:16.778 "product_name": "NVMe disk", 00:32:16.778 "block_size": 512, 00:32:16.778 "num_blocks": 15002931888, 00:32:16.778 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:16.778 "assigned_rate_limits": { 00:32:16.778 "rw_ios_per_sec": 0, 00:32:16.778 "rw_mbytes_per_sec": 0, 00:32:16.778 "r_mbytes_per_sec": 0, 00:32:16.778 "w_mbytes_per_sec": 0 00:32:16.778 }, 00:32:16.778 "claimed": false, 00:32:16.778 "zoned": false, 00:32:16.778 "supported_io_types": { 00:32:16.778 "read": true, 00:32:16.778 "write": true, 00:32:16.778 "unmap": true, 00:32:16.778 "flush": true, 00:32:16.778 "reset": true, 00:32:16.778 "nvme_admin": true, 00:32:16.778 "nvme_io": true, 00:32:16.778 "nvme_io_md": false, 00:32:16.778 "write_zeroes": true, 00:32:16.778 "zcopy": false, 00:32:16.778 "get_zone_info": false, 00:32:16.778 "zone_management": false, 00:32:16.778 "zone_append": false, 00:32:16.778 "compare": false, 00:32:16.778 "compare_and_write": false, 00:32:16.778 "abort": true, 00:32:16.778 "seek_hole": false, 00:32:16.778 "seek_data": false, 00:32:16.778 "copy": false, 00:32:16.778 "nvme_iov_md": false 00:32:16.778 }, 00:32:16.778 "driver_specific": { 00:32:16.778 "nvme": [ 00:32:16.778 { 00:32:16.778 "pci_address": "0000:5e:00.0", 00:32:16.778 "trid": { 00:32:16.778 "trtype": "PCIe", 00:32:16.778 "traddr": "0000:5e:00.0" 00:32:16.778 }, 00:32:16.778 "ctrlr_data": { 00:32:16.778 "cntlid": 0, 00:32:16.778 "vendor_id": "0x8086", 00:32:16.778 "model_number": "INTEL SSDPF2KX076TZO", 00:32:16.778 "serial_number": "PHAC0301002G7P6CGN", 00:32:16.778 "firmware_revision": "JCV10200", 00:32:16.778 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:16.778 "oacs": { 00:32:16.778 "security": 1, 00:32:16.778 "format": 1, 00:32:16.778 "firmware": 1, 00:32:16.778 "ns_manage": 1 00:32:16.778 }, 
00:32:16.778 "multi_ctrlr": false, 00:32:16.778 "ana_reporting": false 00:32:16.778 }, 00:32:16.778 "vs": { 00:32:16.778 "nvme_version": "1.3" 00:32:16.778 }, 00:32:16.778 "ns_data": { 00:32:16.778 "id": 1, 00:32:16.778 "can_share": false 00:32:16.778 }, 00:32:16.778 "security": { 00:32:16.778 "opal": true 00:32:16.778 } 00:32:16.778 } 00:32:16.778 ], 00:32:16.778 "mp_policy": "active_passive" 00:32:16.778 } 00:32:16.778 } 00:32:16.778 ] 00:32:16.778 20:08:08 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:16.778 20:08:08 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:19.308 be076785-1df9-4f14-9574-be63725c6ac8 00:32:19.308 20:08:10 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:19.308 ac955c5b-9d94-4552-b26c-1fe7592abd05 00:32:19.566 20:08:10 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:19.566 20:08:10 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:32:19.566 20:08:10 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:19.566 20:08:10 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:19.566 20:08:10 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:19.566 20:08:10 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:19.566 20:08:10 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:19.824 20:08:11 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:19.824 [ 00:32:19.824 { 00:32:19.824 "name": "ac955c5b-9d94-4552-b26c-1fe7592abd05", 00:32:19.824 "aliases": [ 00:32:19.824 "lvs0/lv0" 
00:32:19.824 ], 00:32:19.824 "product_name": "Logical Volume", 00:32:19.824 "block_size": 512, 00:32:19.824 "num_blocks": 204800, 00:32:19.824 "uuid": "ac955c5b-9d94-4552-b26c-1fe7592abd05", 00:32:19.824 "assigned_rate_limits": { 00:32:19.824 "rw_ios_per_sec": 0, 00:32:19.824 "rw_mbytes_per_sec": 0, 00:32:19.824 "r_mbytes_per_sec": 0, 00:32:19.824 "w_mbytes_per_sec": 0 00:32:19.824 }, 00:32:19.824 "claimed": false, 00:32:19.824 "zoned": false, 00:32:19.824 "supported_io_types": { 00:32:19.824 "read": true, 00:32:19.824 "write": true, 00:32:19.824 "unmap": true, 00:32:19.824 "flush": false, 00:32:19.824 "reset": true, 00:32:19.824 "nvme_admin": false, 00:32:19.824 "nvme_io": false, 00:32:19.824 "nvme_io_md": false, 00:32:19.824 "write_zeroes": true, 00:32:19.824 "zcopy": false, 00:32:19.824 "get_zone_info": false, 00:32:19.824 "zone_management": false, 00:32:19.824 "zone_append": false, 00:32:19.824 "compare": false, 00:32:19.824 "compare_and_write": false, 00:32:19.824 "abort": false, 00:32:19.824 "seek_hole": true, 00:32:19.824 "seek_data": true, 00:32:19.824 "copy": false, 00:32:19.824 "nvme_iov_md": false 00:32:19.824 }, 00:32:19.824 "driver_specific": { 00:32:19.824 "lvol": { 00:32:19.824 "lvol_store_uuid": "be076785-1df9-4f14-9574-be63725c6ac8", 00:32:19.824 "base_bdev": "Nvme0n1", 00:32:19.824 "thin_provision": true, 00:32:19.824 "num_allocated_clusters": 0, 00:32:19.824 "snapshot": false, 00:32:19.824 "clone": false, 00:32:19.824 "esnap_clone": false 00:32:19.824 } 00:32:19.824 } 00:32:19.824 } 00:32:19.824 ] 00:32:20.083 20:08:11 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:20.083 20:08:11 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:32:20.083 20:08:11 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:32:20.083 [2024-07-24 20:08:11.652896] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:32:20.083 COMP_lvs0/lv0 00:32:20.422 20:08:11 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:20.422 20:08:11 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:32:20.422 20:08:11 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:20.422 20:08:11 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:20.422 20:08:11 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:20.422 20:08:11 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:20.422 20:08:11 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:20.422 20:08:11 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:20.680 [ 00:32:20.680 { 00:32:20.680 "name": "COMP_lvs0/lv0", 00:32:20.680 "aliases": [ 00:32:20.680 "0cd8bf90-d924-56b5-baf9-83535c958e10" 00:32:20.680 ], 00:32:20.681 "product_name": "compress", 00:32:20.681 "block_size": 512, 00:32:20.681 "num_blocks": 200704, 00:32:20.681 "uuid": "0cd8bf90-d924-56b5-baf9-83535c958e10", 00:32:20.681 "assigned_rate_limits": { 00:32:20.681 "rw_ios_per_sec": 0, 00:32:20.681 "rw_mbytes_per_sec": 0, 00:32:20.681 "r_mbytes_per_sec": 0, 00:32:20.681 "w_mbytes_per_sec": 0 00:32:20.681 }, 00:32:20.681 "claimed": false, 00:32:20.681 "zoned": false, 00:32:20.681 "supported_io_types": { 00:32:20.681 "read": true, 00:32:20.681 "write": true, 00:32:20.681 "unmap": false, 00:32:20.681 "flush": false, 00:32:20.681 "reset": false, 00:32:20.681 "nvme_admin": false, 00:32:20.681 "nvme_io": false, 00:32:20.681 "nvme_io_md": false, 00:32:20.681 "write_zeroes": true, 00:32:20.681 "zcopy": false, 00:32:20.681 "get_zone_info": false, 00:32:20.681 "zone_management": false, 00:32:20.681 "zone_append": 
false, 00:32:20.681 "compare": false, 00:32:20.681 "compare_and_write": false, 00:32:20.681 "abort": false, 00:32:20.681 "seek_hole": false, 00:32:20.681 "seek_data": false, 00:32:20.681 "copy": false, 00:32:20.681 "nvme_iov_md": false 00:32:20.681 }, 00:32:20.681 "driver_specific": { 00:32:20.681 "compress": { 00:32:20.681 "name": "COMP_lvs0/lv0", 00:32:20.681 "base_bdev_name": "ac955c5b-9d94-4552-b26c-1fe7592abd05", 00:32:20.681 "pm_path": "/tmp/pmem/4bcc414b-c220-422f-8da6-2e038ac277c9" 00:32:20.681 } 00:32:20.681 } 00:32:20.681 } 00:32:20.681 ] 00:32:20.681 20:08:12 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:20.681 20:08:12 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:20.939 Running I/O for 3 seconds... 00:32:24.223 00:32:24.223 Latency(us) 00:32:24.223 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:24.223 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:24.223 Verification LBA range: start 0x0 length 0x3100 00:32:24.223 COMP_lvs0/lv0 : 3.01 1273.68 4.98 0.00 0.00 25009.05 2407.74 21541.40 00:32:24.223 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:24.223 Verification LBA range: start 0x3100 length 0x3100 00:32:24.223 COMP_lvs0/lv0 : 3.01 1275.97 4.98 0.00 0.00 24940.99 1517.30 20743.57 00:32:24.223 =================================================================================================================== 00:32:24.223 Total : 2549.66 9.96 0.00 0.00 24974.99 1517.30 21541.40 00:32:24.223 0 00:32:24.223 20:08:15 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:24.223 20:08:15 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:24.223 20:08:15 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:24.790 20:08:16 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:24.790 20:08:16 compress_isal -- compress/compress.sh@78 -- # killprocess 1556627 00:32:24.790 20:08:16 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1556627 ']' 00:32:24.790 20:08:16 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1556627 00:32:24.790 20:08:16 compress_isal -- common/autotest_common.sh@955 -- # uname 00:32:24.790 20:08:16 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:24.791 20:08:16 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1556627 00:32:24.791 20:08:16 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:32:24.791 20:08:16 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:32:24.791 20:08:16 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1556627' 00:32:24.791 killing process with pid 1556627 00:32:24.791 20:08:16 compress_isal -- common/autotest_common.sh@969 -- # kill 1556627 00:32:24.791 Received shutdown signal, test time was about 3.000000 seconds 00:32:24.791 00:32:24.791 Latency(us) 00:32:24.791 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:24.791 =================================================================================================================== 00:32:24.791 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:24.791 20:08:16 compress_isal -- common/autotest_common.sh@974 -- # wait 1556627 00:32:28.077 20:08:19 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:32:28.077 20:08:19 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:28.077 20:08:19 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1558234 00:32:28.077 20:08:19 compress_isal -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:28.077 20:08:19 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:28.077 20:08:19 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1558234 00:32:28.077 20:08:19 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1558234 ']' 00:32:28.077 20:08:19 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:28.077 20:08:19 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:28.077 20:08:19 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:28.077 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:28.077 20:08:19 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:28.077 20:08:19 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:28.077 [2024-07-24 20:08:19.315944] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:32:28.077 [2024-07-24 20:08:19.316022] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1558234 ] 00:32:28.077 [2024-07-24 20:08:19.437739] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:28.077 [2024-07-24 20:08:19.580439] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:28.077 [2024-07-24 20:08:19.580448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:28.644 20:08:20 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:28.644 20:08:20 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:32:28.644 20:08:20 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:32:28.644 20:08:20 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:28.644 20:08:20 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:29.212 20:08:20 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:29.212 20:08:20 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:32:29.212 20:08:20 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:29.212 20:08:20 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:29.212 20:08:20 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:29.212 20:08:20 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:29.212 20:08:20 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:29.481 20:08:21 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:29.740 [ 00:32:29.741 { 00:32:29.741 "name": "Nvme0n1", 00:32:29.741 "aliases": [ 00:32:29.741 "01000000-0000-0000-5cd2-e43197705251" 00:32:29.741 ], 00:32:29.741 "product_name": "NVMe disk", 00:32:29.741 "block_size": 512, 00:32:29.741 "num_blocks": 15002931888, 00:32:29.741 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:29.741 "assigned_rate_limits": { 00:32:29.741 "rw_ios_per_sec": 0, 00:32:29.741 "rw_mbytes_per_sec": 0, 00:32:29.741 "r_mbytes_per_sec": 0, 00:32:29.741 "w_mbytes_per_sec": 0 00:32:29.741 }, 00:32:29.741 "claimed": false, 00:32:29.741 "zoned": false, 00:32:29.741 "supported_io_types": { 00:32:29.741 "read": true, 00:32:29.741 "write": true, 00:32:29.741 "unmap": true, 00:32:29.741 "flush": true, 00:32:29.741 "reset": true, 00:32:29.741 "nvme_admin": true, 00:32:29.741 "nvme_io": true, 00:32:29.741 "nvme_io_md": false, 00:32:29.741 "write_zeroes": true, 00:32:29.741 "zcopy": false, 00:32:29.741 "get_zone_info": false, 00:32:29.741 "zone_management": false, 00:32:29.741 "zone_append": false, 00:32:29.741 "compare": false, 00:32:29.741 "compare_and_write": false, 00:32:29.741 "abort": true, 00:32:29.741 "seek_hole": false, 00:32:29.741 "seek_data": false, 00:32:29.741 "copy": false, 00:32:29.741 "nvme_iov_md": false 00:32:29.741 }, 00:32:29.741 "driver_specific": { 00:32:29.741 "nvme": [ 00:32:29.741 { 00:32:29.741 "pci_address": "0000:5e:00.0", 00:32:29.741 "trid": { 00:32:29.741 "trtype": "PCIe", 00:32:29.741 "traddr": "0000:5e:00.0" 00:32:29.741 }, 00:32:29.741 "ctrlr_data": { 00:32:29.741 "cntlid": 0, 00:32:29.741 "vendor_id": "0x8086", 00:32:29.741 "model_number": "INTEL SSDPF2KX076TZO", 00:32:29.741 "serial_number": "PHAC0301002G7P6CGN", 00:32:29.741 "firmware_revision": "JCV10200", 00:32:29.741 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:29.741 "oacs": { 00:32:29.741 "security": 1, 00:32:29.741 "format": 1, 00:32:29.741 "firmware": 1, 00:32:29.741 "ns_manage": 1 00:32:29.741 }, 
00:32:29.741 "multi_ctrlr": false, 00:32:29.741 "ana_reporting": false 00:32:29.741 }, 00:32:29.741 "vs": { 00:32:29.741 "nvme_version": "1.3" 00:32:29.741 }, 00:32:29.741 "ns_data": { 00:32:29.741 "id": 1, 00:32:29.741 "can_share": false 00:32:29.741 }, 00:32:29.741 "security": { 00:32:29.741 "opal": true 00:32:29.741 } 00:32:29.741 } 00:32:29.741 ], 00:32:29.741 "mp_policy": "active_passive" 00:32:29.741 } 00:32:29.741 } 00:32:29.741 ] 00:32:29.741 20:08:21 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:29.741 20:08:21 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:32.274 b9616573-1dba-43c6-aef7-c22252fd6a42 00:32:32.274 20:08:23 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:32.274 dfe4bc91-873a-4643-9100-799dcb82c5bd 00:32:32.532 20:08:23 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:32.532 20:08:23 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:32:32.532 20:08:23 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:32.532 20:08:23 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:32.532 20:08:23 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:32.532 20:08:23 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:32.532 20:08:23 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:32.791 20:08:24 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:32.791 [ 00:32:32.791 { 00:32:32.791 "name": "dfe4bc91-873a-4643-9100-799dcb82c5bd", 00:32:32.791 "aliases": [ 00:32:32.791 "lvs0/lv0" 
00:32:32.791 ], 00:32:32.791 "product_name": "Logical Volume", 00:32:32.791 "block_size": 512, 00:32:32.791 "num_blocks": 204800, 00:32:32.791 "uuid": "dfe4bc91-873a-4643-9100-799dcb82c5bd", 00:32:32.791 "assigned_rate_limits": { 00:32:32.791 "rw_ios_per_sec": 0, 00:32:32.791 "rw_mbytes_per_sec": 0, 00:32:32.791 "r_mbytes_per_sec": 0, 00:32:32.791 "w_mbytes_per_sec": 0 00:32:32.791 }, 00:32:32.791 "claimed": false, 00:32:32.791 "zoned": false, 00:32:32.791 "supported_io_types": { 00:32:32.791 "read": true, 00:32:32.791 "write": true, 00:32:32.791 "unmap": true, 00:32:32.791 "flush": false, 00:32:32.791 "reset": true, 00:32:32.791 "nvme_admin": false, 00:32:32.791 "nvme_io": false, 00:32:32.791 "nvme_io_md": false, 00:32:32.791 "write_zeroes": true, 00:32:32.791 "zcopy": false, 00:32:32.791 "get_zone_info": false, 00:32:32.791 "zone_management": false, 00:32:32.791 "zone_append": false, 00:32:32.791 "compare": false, 00:32:32.791 "compare_and_write": false, 00:32:32.791 "abort": false, 00:32:32.791 "seek_hole": true, 00:32:32.791 "seek_data": true, 00:32:32.791 "copy": false, 00:32:32.791 "nvme_iov_md": false 00:32:32.791 }, 00:32:32.791 "driver_specific": { 00:32:32.791 "lvol": { 00:32:32.791 "lvol_store_uuid": "b9616573-1dba-43c6-aef7-c22252fd6a42", 00:32:32.791 "base_bdev": "Nvme0n1", 00:32:32.791 "thin_provision": true, 00:32:32.791 "num_allocated_clusters": 0, 00:32:32.791 "snapshot": false, 00:32:32.791 "clone": false, 00:32:32.791 "esnap_clone": false 00:32:32.791 } 00:32:32.791 } 00:32:32.791 } 00:32:32.791 ] 00:32:32.791 20:08:24 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:32.791 20:08:24 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:32:32.791 20:08:24 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:32:33.050 [2024-07-24 20:08:24.520508] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:33.050 COMP_lvs0/lv0 00:32:33.050 20:08:24 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:33.050 20:08:24 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:32:33.050 20:08:24 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:33.050 20:08:24 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:33.050 20:08:24 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:33.050 20:08:24 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:33.050 20:08:24 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:33.309 20:08:24 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:33.309 [ 00:32:33.309 { 00:32:33.309 "name": "COMP_lvs0/lv0", 00:32:33.309 "aliases": [ 00:32:33.309 "ef2fe991-4625-5a0f-ab3f-fcb0b4df0666" 00:32:33.309 ], 00:32:33.309 "product_name": "compress", 00:32:33.309 "block_size": 4096, 00:32:33.309 "num_blocks": 25088, 00:32:33.309 "uuid": "ef2fe991-4625-5a0f-ab3f-fcb0b4df0666", 00:32:33.309 "assigned_rate_limits": { 00:32:33.309 "rw_ios_per_sec": 0, 00:32:33.309 "rw_mbytes_per_sec": 0, 00:32:33.309 "r_mbytes_per_sec": 0, 00:32:33.309 "w_mbytes_per_sec": 0 00:32:33.309 }, 00:32:33.309 "claimed": false, 00:32:33.309 "zoned": false, 00:32:33.309 "supported_io_types": { 00:32:33.309 "read": true, 00:32:33.309 "write": true, 00:32:33.309 "unmap": false, 00:32:33.309 "flush": false, 00:32:33.309 "reset": false, 00:32:33.309 "nvme_admin": false, 00:32:33.309 "nvme_io": false, 00:32:33.309 "nvme_io_md": false, 00:32:33.309 "write_zeroes": true, 00:32:33.309 "zcopy": false, 00:32:33.309 "get_zone_info": false, 00:32:33.309 "zone_management": false, 00:32:33.309 
"zone_append": false, 00:32:33.309 "compare": false, 00:32:33.309 "compare_and_write": false, 00:32:33.309 "abort": false, 00:32:33.309 "seek_hole": false, 00:32:33.309 "seek_data": false, 00:32:33.309 "copy": false, 00:32:33.309 "nvme_iov_md": false 00:32:33.309 }, 00:32:33.309 "driver_specific": { 00:32:33.310 "compress": { 00:32:33.310 "name": "COMP_lvs0/lv0", 00:32:33.310 "base_bdev_name": "dfe4bc91-873a-4643-9100-799dcb82c5bd", 00:32:33.310 "pm_path": "/tmp/pmem/ba444911-3ae4-4968-8241-0349eedd80f3" 00:32:33.310 } 00:32:33.310 } 00:32:33.310 } 00:32:33.310 ] 00:32:33.569 20:08:24 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:33.569 20:08:24 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:33.569 Running I/O for 3 seconds... 00:32:36.858 00:32:36.858 Latency(us) 00:32:36.858 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:36.858 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:36.858 Verification LBA range: start 0x0 length 0x3100 00:32:36.858 COMP_lvs0/lv0 : 3.01 2101.87 8.21 0.00 0.00 15135.01 1146.88 13278.16 00:32:36.858 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:36.858 Verification LBA range: start 0x3100 length 0x3100 00:32:36.858 COMP_lvs0/lv0 : 3.01 2100.20 8.20 0.00 0.00 15109.88 1253.73 13107.20 00:32:36.858 =================================================================================================================== 00:32:36.858 Total : 4202.07 16.41 0.00 0.00 15122.45 1146.88 13278.16 00:32:36.858 0 00:32:36.858 20:08:28 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:36.858 20:08:28 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:36.858 20:08:28 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:37.116 20:08:28 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:37.116 20:08:28 compress_isal -- compress/compress.sh@78 -- # killprocess 1558234 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1558234 ']' 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1558234 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@955 -- # uname 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1558234 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1558234' 00:32:37.116 killing process with pid 1558234 00:32:37.116 20:08:28 compress_isal -- common/autotest_common.sh@969 -- # kill 1558234 00:32:37.116 Received shutdown signal, test time was about 3.000000 seconds 00:32:37.116 00:32:37.116 Latency(us) 00:32:37.116 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:37.117 =================================================================================================================== 00:32:37.117 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:37.117 20:08:28 compress_isal -- common/autotest_common.sh@974 -- # wait 1558234 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1559842 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@56 -- # trap 
'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1559842 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1559842 ']' 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:40.404 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:40.404 [2024-07-24 20:08:31.617283] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:32:40.404 [2024-07-24 20:08:31.617339] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1559842 ] 00:32:40.404 [2024-07-24 20:08:31.719767] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:40.404 [2024-07-24 20:08:31.826304] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:40.404 [2024-07-24 20:08:31.826349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:40.404 [2024-07-24 20:08:31.826348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:40.404 20:08:31 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@58 -- # create_vols 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:40.404 20:08:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:41.340 20:08:32 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:41.340 20:08:32 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:32:41.340 20:08:32 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:41.340 20:08:32 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:41.340 20:08:32 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:41.340 20:08:32 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:41.340 20:08:32 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:41.340 20:08:32 compress_isal -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:41.340 [ 00:32:41.340 { 00:32:41.340 "name": "Nvme0n1", 00:32:41.340 "aliases": [ 00:32:41.340 "01000000-0000-0000-5cd2-e43197705251" 00:32:41.340 ], 00:32:41.340 "product_name": "NVMe disk", 00:32:41.340 "block_size": 512, 00:32:41.340 "num_blocks": 15002931888, 00:32:41.341 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:41.341 "assigned_rate_limits": { 00:32:41.341 "rw_ios_per_sec": 0, 00:32:41.341 "rw_mbytes_per_sec": 0, 00:32:41.341 "r_mbytes_per_sec": 0, 00:32:41.341 "w_mbytes_per_sec": 0 00:32:41.341 }, 00:32:41.341 "claimed": false, 00:32:41.341 "zoned": false, 00:32:41.341 "supported_io_types": { 00:32:41.341 "read": true, 00:32:41.341 "write": true, 00:32:41.341 "unmap": true, 00:32:41.341 "flush": true, 00:32:41.341 "reset": true, 00:32:41.341 "nvme_admin": true, 00:32:41.341 "nvme_io": true, 00:32:41.341 "nvme_io_md": false, 00:32:41.341 "write_zeroes": true, 00:32:41.341 "zcopy": false, 00:32:41.341 "get_zone_info": false, 00:32:41.341 "zone_management": false, 00:32:41.341 "zone_append": false, 00:32:41.341 "compare": false, 00:32:41.341 "compare_and_write": false, 00:32:41.341 "abort": true, 00:32:41.341 "seek_hole": false, 00:32:41.341 "seek_data": false, 00:32:41.341 "copy": false, 00:32:41.341 "nvme_iov_md": false 00:32:41.341 }, 00:32:41.341 "driver_specific": { 00:32:41.341 "nvme": [ 00:32:41.341 { 00:32:41.341 "pci_address": "0000:5e:00.0", 00:32:41.341 "trid": { 00:32:41.341 "trtype": "PCIe", 00:32:41.341 "traddr": "0000:5e:00.0" 00:32:41.341 }, 00:32:41.341 "ctrlr_data": { 00:32:41.341 "cntlid": 0, 00:32:41.341 "vendor_id": "0x8086", 00:32:41.341 "model_number": "INTEL SSDPF2KX076TZO", 00:32:41.341 "serial_number": "PHAC0301002G7P6CGN", 00:32:41.341 "firmware_revision": "JCV10200", 00:32:41.341 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:41.341 "oacs": { 00:32:41.341 "security": 1, 
00:32:41.341 "format": 1, 00:32:41.341 "firmware": 1, 00:32:41.341 "ns_manage": 1 00:32:41.341 }, 00:32:41.341 "multi_ctrlr": false, 00:32:41.341 "ana_reporting": false 00:32:41.341 }, 00:32:41.341 "vs": { 00:32:41.341 "nvme_version": "1.3" 00:32:41.341 }, 00:32:41.341 "ns_data": { 00:32:41.341 "id": 1, 00:32:41.341 "can_share": false 00:32:41.341 }, 00:32:41.341 "security": { 00:32:41.341 "opal": true 00:32:41.341 } 00:32:41.341 } 00:32:41.341 ], 00:32:41.341 "mp_policy": "active_passive" 00:32:41.341 } 00:32:41.341 } 00:32:41.341 ] 00:32:41.599 20:08:32 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:41.599 20:08:32 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:44.129 c913b629-47a8-4389-908c-f05485ef85ae 00:32:44.130 20:08:35 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:44.130 c158337a-d151-4be3-85af-b39d94449345 00:32:44.130 20:08:35 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:44.130 20:08:35 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:32:44.130 20:08:35 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:44.130 20:08:35 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:44.130 20:08:35 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:44.130 20:08:35 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:44.130 20:08:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:44.388 20:08:35 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:44.388 [ 00:32:44.388 { 00:32:44.388 
"name": "c158337a-d151-4be3-85af-b39d94449345", 00:32:44.388 "aliases": [ 00:32:44.388 "lvs0/lv0" 00:32:44.388 ], 00:32:44.388 "product_name": "Logical Volume", 00:32:44.388 "block_size": 512, 00:32:44.388 "num_blocks": 204800, 00:32:44.388 "uuid": "c158337a-d151-4be3-85af-b39d94449345", 00:32:44.388 "assigned_rate_limits": { 00:32:44.388 "rw_ios_per_sec": 0, 00:32:44.388 "rw_mbytes_per_sec": 0, 00:32:44.388 "r_mbytes_per_sec": 0, 00:32:44.388 "w_mbytes_per_sec": 0 00:32:44.388 }, 00:32:44.388 "claimed": false, 00:32:44.388 "zoned": false, 00:32:44.388 "supported_io_types": { 00:32:44.388 "read": true, 00:32:44.388 "write": true, 00:32:44.388 "unmap": true, 00:32:44.388 "flush": false, 00:32:44.388 "reset": true, 00:32:44.388 "nvme_admin": false, 00:32:44.388 "nvme_io": false, 00:32:44.388 "nvme_io_md": false, 00:32:44.388 "write_zeroes": true, 00:32:44.388 "zcopy": false, 00:32:44.388 "get_zone_info": false, 00:32:44.388 "zone_management": false, 00:32:44.388 "zone_append": false, 00:32:44.388 "compare": false, 00:32:44.388 "compare_and_write": false, 00:32:44.388 "abort": false, 00:32:44.388 "seek_hole": true, 00:32:44.388 "seek_data": true, 00:32:44.388 "copy": false, 00:32:44.388 "nvme_iov_md": false 00:32:44.388 }, 00:32:44.388 "driver_specific": { 00:32:44.388 "lvol": { 00:32:44.388 "lvol_store_uuid": "c913b629-47a8-4389-908c-f05485ef85ae", 00:32:44.388 "base_bdev": "Nvme0n1", 00:32:44.388 "thin_provision": true, 00:32:44.388 "num_allocated_clusters": 0, 00:32:44.388 "snapshot": false, 00:32:44.388 "clone": false, 00:32:44.388 "esnap_clone": false 00:32:44.388 } 00:32:44.388 } 00:32:44.388 } 00:32:44.388 ] 00:32:44.388 20:08:35 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:44.388 20:08:35 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:44.388 20:08:35 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:44.647 
[2024-07-24 20:08:36.076256] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:44.647 COMP_lvs0/lv0 00:32:44.647 20:08:36 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:44.647 20:08:36 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:32:44.647 20:08:36 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:44.647 20:08:36 compress_isal -- common/autotest_common.sh@901 -- # local i 00:32:44.647 20:08:36 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:44.647 20:08:36 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:44.647 20:08:36 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:44.905 20:08:36 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:45.164 [ 00:32:45.164 { 00:32:45.164 "name": "COMP_lvs0/lv0", 00:32:45.164 "aliases": [ 00:32:45.164 "90e77fd8-cb26-5b01-8417-36f1f1398784" 00:32:45.164 ], 00:32:45.164 "product_name": "compress", 00:32:45.164 "block_size": 512, 00:32:45.164 "num_blocks": 200704, 00:32:45.164 "uuid": "90e77fd8-cb26-5b01-8417-36f1f1398784", 00:32:45.164 "assigned_rate_limits": { 00:32:45.164 "rw_ios_per_sec": 0, 00:32:45.164 "rw_mbytes_per_sec": 0, 00:32:45.164 "r_mbytes_per_sec": 0, 00:32:45.164 "w_mbytes_per_sec": 0 00:32:45.164 }, 00:32:45.164 "claimed": false, 00:32:45.164 "zoned": false, 00:32:45.164 "supported_io_types": { 00:32:45.164 "read": true, 00:32:45.164 "write": true, 00:32:45.164 "unmap": false, 00:32:45.164 "flush": false, 00:32:45.164 "reset": false, 00:32:45.164 "nvme_admin": false, 00:32:45.164 "nvme_io": false, 00:32:45.164 "nvme_io_md": false, 00:32:45.164 "write_zeroes": true, 00:32:45.164 "zcopy": false, 00:32:45.164 
"get_zone_info": false, 00:32:45.164 "zone_management": false, 00:32:45.164 "zone_append": false, 00:32:45.164 "compare": false, 00:32:45.164 "compare_and_write": false, 00:32:45.164 "abort": false, 00:32:45.164 "seek_hole": false, 00:32:45.164 "seek_data": false, 00:32:45.164 "copy": false, 00:32:45.164 "nvme_iov_md": false 00:32:45.164 }, 00:32:45.164 "driver_specific": { 00:32:45.164 "compress": { 00:32:45.164 "name": "COMP_lvs0/lv0", 00:32:45.164 "base_bdev_name": "c158337a-d151-4be3-85af-b39d94449345", 00:32:45.164 "pm_path": "/tmp/pmem/911b87d5-9bae-426b-b3f4-6272cd405e36" 00:32:45.164 } 00:32:45.164 } 00:32:45.164 } 00:32:45.164 ] 00:32:45.164 20:08:36 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:32:45.164 20:08:36 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:45.164 I/O targets: 00:32:45.164 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:32:45.164 00:32:45.164 00:32:45.164 CUnit - A unit testing framework for C - Version 2.1-3 00:32:45.164 http://cunit.sourceforge.net/ 00:32:45.164 00:32:45.164 00:32:45.164 Suite: bdevio tests on: COMP_lvs0/lv0 00:32:45.164 Test: blockdev write read block ...passed 00:32:45.164 Test: blockdev write zeroes read block ...passed 00:32:45.164 Test: blockdev write zeroes read no split ...passed 00:32:45.164 Test: blockdev write zeroes read split ...passed 00:32:45.422 Test: blockdev write zeroes read split partial ...passed 00:32:45.422 Test: blockdev reset ...[2024-07-24 20:08:36.788629] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:32:45.422 passed 00:32:45.422 Test: blockdev write read 8 blocks ...passed 00:32:45.422 Test: blockdev write read size > 128k ...passed 00:32:45.422 Test: blockdev write read invalid size ...passed 00:32:45.422 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:45.422 Test: blockdev write read offset + nbytes > size 
of blockdev ...passed 00:32:45.422 Test: blockdev write read max offset ...passed 00:32:45.422 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:45.422 Test: blockdev writev readv 8 blocks ...passed 00:32:45.422 Test: blockdev writev readv 30 x 1block ...passed 00:32:45.422 Test: blockdev writev readv block ...passed 00:32:45.422 Test: blockdev writev readv size > 128k ...passed 00:32:45.422 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:45.422 Test: blockdev comparev and writev ...passed 00:32:45.422 Test: blockdev nvme passthru rw ...passed 00:32:45.422 Test: blockdev nvme passthru vendor specific ...passed 00:32:45.422 Test: blockdev nvme admin passthru ...passed 00:32:45.422 Test: blockdev copy ...passed 00:32:45.422 00:32:45.422 Run Summary: Type Total Ran Passed Failed Inactive 00:32:45.422 suites 1 1 n/a 0 0 00:32:45.422 tests 23 23 23 0 0 00:32:45.422 asserts 130 130 130 0 n/a 00:32:45.422 00:32:45.422 Elapsed time = 0.292 seconds 00:32:45.422 0 00:32:45.422 20:08:36 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:32:45.422 20:08:36 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:45.422 20:08:37 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:45.680 20:08:37 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:32:45.680 20:08:37 compress_isal -- compress/compress.sh@62 -- # killprocess 1559842 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1559842 ']' 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1559842 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@955 -- # uname 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:45.680 20:08:37 
compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1559842 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1559842' 00:32:45.680 killing process with pid 1559842 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@969 -- # kill 1559842 00:32:45.680 20:08:37 compress_isal -- common/autotest_common.sh@974 -- # wait 1559842 00:32:48.965 20:08:40 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:32:48.965 20:08:40 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:32:48.965 00:32:48.965 real 0m47.384s 00:32:48.965 user 1m48.869s 00:32:48.965 sys 0m4.532s 00:32:48.965 20:08:40 compress_isal -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:48.965 20:08:40 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:48.965 ************************************ 00:32:48.965 END TEST compress_isal 00:32:48.965 ************************************ 00:32:48.965 20:08:40 -- spdk/autotest.sh@356 -- # '[' 0 -eq 1 ']' 00:32:48.965 20:08:40 -- spdk/autotest.sh@360 -- # '[' 1 -eq 1 ']' 00:32:48.965 20:08:40 -- spdk/autotest.sh@361 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:48.965 20:08:40 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:32:48.965 20:08:40 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:48.965 20:08:40 -- common/autotest_common.sh@10 -- # set +x 00:32:48.965 ************************************ 00:32:48.965 START TEST blockdev_crypto_aesni 00:32:48.965 ************************************ 00:32:48.965 20:08:40 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:48.965 * Looking for test storage... 00:32:48.965 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # 
crypto_device= 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1561063 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:48.965 20:08:40 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1561063 00:32:48.965 20:08:40 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # '[' -z 1561063 ']' 00:32:48.966 20:08:40 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:48.966 20:08:40 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:48.966 20:08:40 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:48.966 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:32:48.966 20:08:40 blockdev_crypto_aesni -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:48.966 20:08:40 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:48.966 [2024-07-24 20:08:40.529487] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:32:48.966 [2024-07-24 20:08:40.529564] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561063 ] 00:32:49.223 [2024-07-24 20:08:40.655894] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:49.223 [2024-07-24 20:08:40.758541] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:50.160 20:08:41 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:50.160 20:08:41 blockdev_crypto_aesni -- common/autotest_common.sh@864 -- # return 0 00:32:50.160 20:08:41 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:32:50.160 20:08:41 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:32:50.160 20:08:41 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:32:50.160 20:08:41 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:50.160 20:08:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:50.160 [2024-07-24 20:08:41.468779] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:50.160 [2024-07-24 20:08:41.476812] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:50.160 [2024-07-24 20:08:41.484829] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:50.160 [2024-07-24 20:08:41.558761] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:32:52.693 true 00:32:52.693 true 00:32:52.693 true 00:32:52.693 true 00:32:52.693 Malloc0 00:32:52.693 Malloc1 00:32:52.693 Malloc2 00:32:52.693 Malloc3 00:32:52.693 [2024-07-24 20:08:43.952014] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:52.693 crypto_ram 00:32:52.693 [2024-07-24 20:08:43.960030] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:52.693 crypto_ram2 00:32:52.693 [2024-07-24 20:08:43.968051] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:52.693 crypto_ram3 00:32:52.693 [2024-07-24 20:08:43.976075] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:52.693 crypto_ram4 00:32:52.693 20:08:43 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:52.693 20:08:43 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:32:52.693 20:08:43 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:52.693 20:08:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:52.693 20:08:43 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:52.693 20:08:43 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:32:52.693 20:08:43 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:32:52.693 20:08:43 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:52.693 20:08:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:52.693 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:52.693 20:08:44 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:52.693 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:52.693 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:32:52.693 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:32:52.693 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:52.693 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:52.693 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:32:52.693 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:32:52.694 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "5c78bf3d-dbcd-5525-b1c3-8af059af5dde"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5c78bf3d-dbcd-5525-b1c3-8af059af5dde",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "7eeeb06d-dc54-5716-8b1a-05fea4676e65"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7eeeb06d-dc54-5716-8b1a-05fea4676e65",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "016b3bf2-d203-5d9c-afdb-3504ed3aa97c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "016b3bf2-d203-5d9c-afdb-3504ed3aa97c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "f2121967-224b-5242-83bb-41172c543207"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f2121967-224b-5242-83bb-41172c543207",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:52.694 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@749 
-- # bdev_list=("${bdevs_name[@]}") 00:32:52.694 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:32:52.694 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:32:52.694 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 1561063 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # '[' -z 1561063 ']' 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # kill -0 1561063 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # uname 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1561063 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1561063' 00:32:52.694 killing process with pid 1561063 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@969 -- # kill 1561063 00:32:52.694 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@974 -- # wait 1561063 00:32:53.263 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:53.263 20:08:44 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:53.263 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:32:53.263 20:08:44 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:53.263 20:08:44 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:53.522 ************************************ 00:32:53.522 START TEST bdev_hello_world 00:32:53.522 ************************************ 00:32:53.522 20:08:44 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:53.522 [2024-07-24 20:08:44.915215] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:32:53.522 [2024-07-24 20:08:44.915277] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561673 ] 00:32:53.522 [2024-07-24 20:08:45.045235] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:53.781 [2024-07-24 20:08:45.147612] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:53.781 [2024-07-24 20:08:45.168942] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:53.781 [2024-07-24 20:08:45.176969] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:53.781 [2024-07-24 20:08:45.184992] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:53.781 [2024-07-24 20:08:45.287880] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:56.315 [2024-07-24 20:08:47.505042] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:56.315 [2024-07-24 20:08:47.505112] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:56.315 [2024-07-24 20:08:47.505127] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:56.315 [2024-07-24 20:08:47.513060] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:56.315 [2024-07-24 20:08:47.513081] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:56.315 [2024-07-24 20:08:47.513094] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:56.315 [2024-07-24 20:08:47.521081] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:56.315 [2024-07-24 20:08:47.521101] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:56.315 [2024-07-24 20:08:47.521113] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:56.315 [2024-07-24 20:08:47.529101] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:56.315 [2024-07-24 20:08:47.529119] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:56.315 [2024-07-24 20:08:47.529132] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:56.315 [2024-07-24 20:08:47.602072] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:56.315 [2024-07-24 20:08:47.602118] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:56.315 [2024-07-24 20:08:47.602137] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:56.315 [2024-07-24 20:08:47.603416] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:56.315 [2024-07-24 20:08:47.603491] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:56.315 [2024-07-24 20:08:47.603509] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:56.315 [2024-07-24 20:08:47.603555] hello_bdev.c: 
65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:32:56.315 00:32:56.315 [2024-07-24 20:08:47.603575] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:56.574 00:32:56.574 real 0m3.125s 00:32:56.574 user 0m2.727s 00:32:56.574 sys 0m0.364s 00:32:56.574 20:08:47 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:56.574 20:08:47 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:56.574 ************************************ 00:32:56.574 END TEST bdev_hello_world 00:32:56.574 ************************************ 00:32:56.574 20:08:48 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:32:56.574 20:08:48 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:32:56.574 20:08:48 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:56.574 20:08:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:56.574 ************************************ 00:32:56.574 START TEST bdev_bounds 00:32:56.574 ************************************ 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1562044 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1562044' 00:32:56.574 Process bdevio pid: 1562044 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- 
bdev/blockdev.sh@292 -- # waitforlisten 1562044 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1562044 ']' 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:56.574 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:56.574 20:08:48 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:56.574 [2024-07-24 20:08:48.128826] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:32:56.574 [2024-07-24 20:08:48.128893] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562044 ] 00:32:56.835 [2024-07-24 20:08:48.259706] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:56.835 [2024-07-24 20:08:48.369130] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:56.835 [2024-07-24 20:08:48.369230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:56.835 [2024-07-24 20:08:48.369231] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:56.835 [2024-07-24 20:08:48.390714] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:56.835 [2024-07-24 20:08:48.398739] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:56.835 [2024-07-24 20:08:48.406759] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:57.150 [2024-07-24 20:08:48.516306] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:59.731 [2024-07-24 20:08:50.756702] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:59.731 [2024-07-24 20:08:50.756778] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:59.731 [2024-07-24 20:08:50.756793] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.731 [2024-07-24 20:08:50.764718] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:59.731 [2024-07-24 20:08:50.764740] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:59.731 [2024-07-24 
20:08:50.764757] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.731 [2024-07-24 20:08:50.772744] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:59.731 [2024-07-24 20:08:50.772762] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:59.731 [2024-07-24 20:08:50.772773] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.731 [2024-07-24 20:08:50.780770] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:59.731 [2024-07-24 20:08:50.780790] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:59.731 [2024-07-24 20:08:50.780802] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:59.731 20:08:50 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:59.731 20:08:50 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:32:59.731 20:08:50 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:59.731 I/O targets: 00:32:59.731 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:59.731 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:32:59.731 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:59.731 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:32:59.731 00:32:59.731 00:32:59.731 CUnit - A unit testing framework for C - Version 2.1-3 00:32:59.731 http://cunit.sourceforge.net/ 00:32:59.731 00:32:59.731 00:32:59.731 Suite: bdevio tests on: crypto_ram4 00:32:59.731 Test: blockdev write read block ...passed 00:32:59.731 Test: blockdev write zeroes read block ...passed 00:32:59.731 Test: blockdev write zeroes read no split ...passed 00:32:59.731 Test: blockdev 
write zeroes read split ...passed 00:32:59.731 Test: blockdev write zeroes read split partial ...passed 00:32:59.731 Test: blockdev reset ...passed 00:32:59.731 Test: blockdev write read 8 blocks ...passed 00:32:59.731 Test: blockdev write read size > 128k ...passed 00:32:59.731 Test: blockdev write read invalid size ...passed 00:32:59.731 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:59.731 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:59.731 Test: blockdev write read max offset ...passed 00:32:59.731 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:59.731 Test: blockdev writev readv 8 blocks ...passed 00:32:59.731 Test: blockdev writev readv 30 x 1block ...passed 00:32:59.731 Test: blockdev writev readv block ...passed 00:32:59.731 Test: blockdev writev readv size > 128k ...passed 00:32:59.731 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:59.731 Test: blockdev comparev and writev ...passed 00:32:59.731 Test: blockdev nvme passthru rw ...passed 00:32:59.731 Test: blockdev nvme passthru vendor specific ...passed 00:32:59.731 Test: blockdev nvme admin passthru ...passed 00:32:59.731 Test: blockdev copy ...passed 00:32:59.731 Suite: bdevio tests on: crypto_ram3 00:32:59.731 Test: blockdev write read block ...passed 00:32:59.731 Test: blockdev write zeroes read block ...passed 00:32:59.731 Test: blockdev write zeroes read no split ...passed 00:32:59.731 Test: blockdev write zeroes read split ...passed 00:32:59.731 Test: blockdev write zeroes read split partial ...passed 00:32:59.731 Test: blockdev reset ...passed 00:32:59.731 Test: blockdev write read 8 blocks ...passed 00:32:59.731 Test: blockdev write read size > 128k ...passed 00:32:59.731 Test: blockdev write read invalid size ...passed 00:32:59.731 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:59.731 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:32:59.731 Test: blockdev write read max offset ...passed 00:32:59.731 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:59.731 Test: blockdev writev readv 8 blocks ...passed 00:32:59.731 Test: blockdev writev readv 30 x 1block ...passed 00:32:59.731 Test: blockdev writev readv block ...passed 00:32:59.731 Test: blockdev writev readv size > 128k ...passed 00:32:59.731 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:59.731 Test: blockdev comparev and writev ...passed 00:32:59.731 Test: blockdev nvme passthru rw ...passed 00:32:59.731 Test: blockdev nvme passthru vendor specific ...passed 00:32:59.731 Test: blockdev nvme admin passthru ...passed 00:32:59.731 Test: blockdev copy ...passed 00:32:59.731 Suite: bdevio tests on: crypto_ram2 00:32:59.731 Test: blockdev write read block ...passed 00:32:59.731 Test: blockdev write zeroes read block ...passed 00:32:59.731 Test: blockdev write zeroes read no split ...passed 00:32:59.731 Test: blockdev write zeroes read split ...passed 00:32:59.990 Test: blockdev write zeroes read split partial ...passed 00:32:59.990 Test: blockdev reset ...passed 00:32:59.991 Test: blockdev write read 8 blocks ...passed 00:32:59.991 Test: blockdev write read size > 128k ...passed 00:32:59.991 Test: blockdev write read invalid size ...passed 00:32:59.991 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:59.991 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:59.991 Test: blockdev write read max offset ...passed 00:32:59.991 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:59.991 Test: blockdev writev readv 8 blocks ...passed 00:32:59.991 Test: blockdev writev readv 30 x 1block ...passed 00:32:59.991 Test: blockdev writev readv block ...passed 00:32:59.991 Test: blockdev writev readv size > 128k ...passed 00:32:59.991 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:59.991 Test: 
blockdev comparev and writev ...passed 00:32:59.991 Test: blockdev nvme passthru rw ...passed 00:32:59.991 Test: blockdev nvme passthru vendor specific ...passed 00:32:59.991 Test: blockdev nvme admin passthru ...passed 00:32:59.991 Test: blockdev copy ...passed 00:32:59.991 Suite: bdevio tests on: crypto_ram 00:32:59.991 Test: blockdev write read block ...passed 00:32:59.991 Test: blockdev write zeroes read block ...passed 00:32:59.991 Test: blockdev write zeroes read no split ...passed 00:33:00.250 Test: blockdev write zeroes read split ...passed 00:33:00.250 Test: blockdev write zeroes read split partial ...passed 00:33:00.250 Test: blockdev reset ...passed 00:33:00.250 Test: blockdev write read 8 blocks ...passed 00:33:00.250 Test: blockdev write read size > 128k ...passed 00:33:00.250 Test: blockdev write read invalid size ...passed 00:33:00.250 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:00.250 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:00.250 Test: blockdev write read max offset ...passed 00:33:00.250 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:00.250 Test: blockdev writev readv 8 blocks ...passed 00:33:00.250 Test: blockdev writev readv 30 x 1block ...passed 00:33:00.250 Test: blockdev writev readv block ...passed 00:33:00.250 Test: blockdev writev readv size > 128k ...passed 00:33:00.250 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:00.250 Test: blockdev comparev and writev ...passed 00:33:00.250 Test: blockdev nvme passthru rw ...passed 00:33:00.250 Test: blockdev nvme passthru vendor specific ...passed 00:33:00.250 Test: blockdev nvme admin passthru ...passed 00:33:00.250 Test: blockdev copy ...passed 00:33:00.250 00:33:00.250 Run Summary: Type Total Ran Passed Failed Inactive 00:33:00.250 suites 4 4 n/a 0 0 00:33:00.250 tests 92 92 92 0 0 00:33:00.250 asserts 520 520 520 0 n/a 00:33:00.250 00:33:00.250 Elapsed time = 1.635 
seconds 00:33:00.250 0 00:33:00.250 20:08:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1562044 00:33:00.250 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1562044 ']' 00:33:00.250 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1562044 00:33:00.250 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:33:00.250 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:00.250 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1562044 00:33:00.510 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:33:00.510 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:33:00.510 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1562044' 00:33:00.510 killing process with pid 1562044 00:33:00.510 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1562044 00:33:00.510 20:08:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1562044 00:33:00.769 20:08:52 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:33:00.769 00:33:00.769 real 0m4.244s 00:33:00.769 user 0m11.237s 00:33:00.769 sys 0m0.589s 00:33:00.769 20:08:52 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:00.769 20:08:52 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:00.769 ************************************ 00:33:00.769 END TEST bdev_bounds 00:33:00.769 ************************************ 00:33:00.769 20:08:52 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:33:00.769 20:08:52 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:33:00.769 20:08:52 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:00.769 20:08:52 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:01.028 ************************************ 00:33:01.028 START TEST bdev_nbd 00:33:01.028 ************************************ 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:01.028 
20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1562601 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1562601 /var/tmp/spdk-nbd.sock 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1562601 ']' 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:01.028 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:01.028 20:08:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:01.028 [2024-07-24 20:08:52.474680] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:33:01.028 [2024-07-24 20:08:52.474758] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:01.028 [2024-07-24 20:08:52.607306] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:01.287 [2024-07-24 20:08:52.708711] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:01.287 [2024-07-24 20:08:52.730059] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:01.287 [2024-07-24 20:08:52.738080] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:01.287 [2024-07-24 20:08:52.746099] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:01.287 [2024-07-24 20:08:52.857367] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:03.825 [2024-07-24 20:08:55.088976] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:03.825 [2024-07-24 20:08:55.089042] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:03.825 [2024-07-24 20:08:55.089057] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.825 [2024-07-24 20:08:55.096994] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:03.825 [2024-07-24 20:08:55.097015] bdev.c:8190:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:03.825 [2024-07-24 20:08:55.097027] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.825 [2024-07-24 20:08:55.105016] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:03.825 [2024-07-24 20:08:55.105035] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:03.825 [2024-07-24 20:08:55.105047] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.825 [2024-07-24 20:08:55.113050] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:03.825 [2024-07-24 20:08:55.113069] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:03.825 [2024-07-24 20:08:55.113081] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:33:03.825 20:08:55 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:03.825 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:04.083 20:08:55 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:04.083 1+0 records in 00:33:04.083 1+0 records out 00:33:04.083 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029344 s, 14.0 MB/s 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:04.083 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:04.343 20:08:55 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:04.343 1+0 records in 00:33:04.343 1+0 records out 00:33:04.343 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290839 s, 14.1 MB/s 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:04.343 20:08:55 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:04.602 
20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:04.602 1+0 records in 00:33:04.602 1+0 records out 00:33:04.602 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337612 s, 12.1 MB/s 00:33:04.602 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.603 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:04.603 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.603 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # 
'[' 4096 '!=' 0 ']' 00:33:04.603 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:04.603 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:04.603 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:04.603 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:04.862 1+0 records in 00:33:04.862 1+0 records out 00:33:04.862 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000363783 s, 11.3 MB/s 
00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:04.862 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd0", 00:33:05.122 "bdev_name": "crypto_ram" 00:33:05.122 }, 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd1", 00:33:05.122 "bdev_name": "crypto_ram2" 00:33:05.122 }, 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd2", 00:33:05.122 "bdev_name": "crypto_ram3" 00:33:05.122 }, 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd3", 00:33:05.122 "bdev_name": "crypto_ram4" 00:33:05.122 } 00:33:05.122 ]' 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd0", 00:33:05.122 "bdev_name": "crypto_ram" 00:33:05.122 }, 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd1", 
00:33:05.122 "bdev_name": "crypto_ram2" 00:33:05.122 }, 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd2", 00:33:05.122 "bdev_name": "crypto_ram3" 00:33:05.122 }, 00:33:05.122 { 00:33:05.122 "nbd_device": "/dev/nbd3", 00:33:05.122 "bdev_name": "crypto_ram4" 00:33:05.122 } 00:33:05.122 ]' 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:05.122 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:05.380 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@41 -- # break 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:05.381 20:08:56 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:05.949 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:33:06.208 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:33:06.208 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:33:06.208 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:33:06.208 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:06.208 20:08:57 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:06.208 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:33:06.209 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:06.209 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:06.209 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:06.209 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:06.468 20:08:57 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:07.036 20:08:58 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:07.036 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:07.037 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:07.037 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:07.037 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:07.037 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:07.037 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:07.296 /dev/nbd0 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:33:07.296 20:08:58 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:07.296 1+0 records in 00:33:07.296 1+0 records out 00:33:07.296 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319065 s, 12.8 MB/s 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:07.296 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:33:07.556 /dev/nbd1 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 
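The `waitfornbd` polling traced above (grep against `/proc/partitions`, up to 20 retries) can be sketched as a standalone helper. This is a minimal sketch, not the SPDK helper itself; the optional second argument (the partitions file) is an illustration-only parameter so the loop can be exercised without a real nbd device, and the full helper additionally confirms the device with a 4 KiB O_DIRECT `dd` read as seen in the trace.

```shell
# Sketch of the /proc/partitions polling used by waitfornbd/waitfornbd_exit in
# the trace: retry up to 20 times until the named device shows up as a word in
# the partitions table. The second argument is a hypothetical override for
# testing; the real helpers always read /proc/partitions.
wait_for_partition() {
    local nbd_name=$1 partitions=${2:-/proc/partitions} i
    for ((i = 1; i <= 20; i++)); do
        if grep -q -w "$nbd_name" "$partitions"; then
            return 0   # device registered
        fi
        sleep 0.1
    done
    return 1           # gave up after 20 attempts
}
```

The `break` at `nbd_common.sh@41` in the trace corresponds to the successful `grep` exit here.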
00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:07.556 1+0 records in 00:33:07.556 1+0 records out 00:33:07.556 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309594 s, 13.2 MB/s 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:07.556 20:08:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:33:07.815 /dev/nbd10 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:07.815 1+0 records in 00:33:07.815 1+0 records out 00:33:07.815 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281385 s, 14.6 MB/s 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:07.815 20:08:59 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:07.815 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:33:08.074 /dev/nbd11 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:08.074 1+0 records in 00:33:08.074 1+0 records out 00:33:08.074 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000409354 s, 10.0 
MB/s 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:08.074 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd0", 00:33:08.333 "bdev_name": "crypto_ram" 00:33:08.333 }, 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd1", 00:33:08.333 "bdev_name": "crypto_ram2" 00:33:08.333 }, 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd10", 00:33:08.333 "bdev_name": "crypto_ram3" 00:33:08.333 }, 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd11", 00:33:08.333 "bdev_name": "crypto_ram4" 00:33:08.333 } 00:33:08.333 ]' 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd0", 00:33:08.333 "bdev_name": 
"crypto_ram" 00:33:08.333 }, 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd1", 00:33:08.333 "bdev_name": "crypto_ram2" 00:33:08.333 }, 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd10", 00:33:08.333 "bdev_name": "crypto_ram3" 00:33:08.333 }, 00:33:08.333 { 00:33:08.333 "nbd_device": "/dev/nbd11", 00:33:08.333 "bdev_name": "crypto_ram4" 00:33:08.333 } 00:33:08.333 ]' 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:08.333 /dev/nbd1 00:33:08.333 /dev/nbd10 00:33:08.333 /dev/nbd11' 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:08.333 /dev/nbd1 00:33:08.333 /dev/nbd10 00:33:08.333 /dev/nbd11' 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:33:08.333 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:08.334 
20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:08.334 256+0 records in 00:33:08.334 256+0 records out 00:33:08.334 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114396 s, 91.7 MB/s 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:08.334 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:08.592 256+0 records in 00:33:08.592 256+0 records out 00:33:08.592 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0613833 s, 17.1 MB/s 00:33:08.593 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:08:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:08.593 256+0 records in 00:33:08.593 256+0 records out 00:33:08.593 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0630533 s, 16.6 MB/s 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:33:08.593 256+0 records in 00:33:08.593 256+0 records out 00:33:08.593 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0510857 s, 20.5 MB/s 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:33:08.593 256+0 records in 00:33:08.593 256+0 records out 00:33:08.593 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0561756 s, 18.7 MB/s 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 
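The `nbd_dd_data_verify` write/verify cycle traced above can be sketched as follows. This is an illustration under stated assumptions, not the SPDK helper: the scratch path is hypothetical, and the `NBD_DD_FLAGS` knob is an illustration-only override (the trace always uses `oflag=direct`) so the flow can be exercised against plain files.

```shell
# Sketch of the write/verify flow from nbd_common.sh@70-85 in the trace: fill a
# scratch file with 1 MiB of random data, dd it onto every nbd device, then on
# the verify pass read each device back with cmp and remove the scratch file.
nbd_dd_data_verify() {
    local operation=$1; shift
    local nbd_list=("$@")
    local tmp_file=/tmp/nbdrandtest i      # hypothetical scratch path
    local dd_flags=${NBD_DD_FLAGS-oflag=direct}   # illustration-only override
    if [ "$operation" = write ]; then
        dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
        for i in "${nbd_list[@]}"; do
            dd if="$tmp_file" of="$i" bs=4096 count=256 $dd_flags
        done
    elif [ "$operation" = verify ]; then
        for i in "${nbd_list[@]}"; do
            cmp -b -n 1M "$tmp_file" "$i" || return 1
        done
        rm "$tmp_file"
    fi
}
```

Writing through the nbd device and comparing against the original file is what actually exercises the crypto bdev's encrypt/decrypt path end to end.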
00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:08.593 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:08.852 
20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:08.852 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:09.418 
20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:09.418 20:09:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:09.676 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:09.933 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 
-- # nbd_disks_json='[]' 00:33:09.933 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:09.933 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:10.191 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:10.192 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:10.192 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:10.192 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:10.192 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:10.192 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:10.192 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:10.192 20:09:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:10.450 malloc_lvol_verify 00:33:10.450 20:09:01 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:10.708 87e2faee-a8e8-467a-a111-dd47bb1841d7 00:33:10.708 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:10.966 69971f8b-3739-446a-9ce9-c3b5f5cec1db 00:33:10.966 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:11.225 /dev/nbd0 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:11.225 mke2fs 1.46.5 (30-Dec-2021) 00:33:11.225 Discarding device blocks: 0/4096 done 00:33:11.225 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:11.225 00:33:11.225 Allocating group tables: 0/1 done 00:33:11.225 Writing inode tables: 0/1 done 00:33:11.225 Creating journal (1024 blocks): done 00:33:11.225 Writing superblocks and filesystem accounting information: 0/1 done 00:33:11.225 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 
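The `nbd_get_count` pattern traced earlier (`nbd_common.sh@61-66`) can be sketched as below. A minimal sketch assuming the JSON is passed in directly; in the trace it comes from `rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks`. The `|| true` guard matches the bare `true` at `nbd_common.sh@65` in the trace: `grep -c` exits non-zero when the count is 0, as in the empty `[]` case.

```shell
# Sketch of nbd_get_count: nbd_get_disks returns a JSON array of
# {nbd_device, bdev_name} objects; extract the device paths with jq and
# count them with grep -c, tolerating a zero count.
nbd_get_count() {
    local nbd_disks_json=$1
    local nbd_disks_name count
    nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
    count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
    echo "$count"
}
```

With all four disks attached this yields 4, and 0 after the stop sequence, which is exactly the `count=4` / `count=0` pair of checks seen in the trace.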
00:33:11.225 20:09:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:11.483 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1562601 00:33:11.484 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1562601 ']' 00:33:11.484 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1562601 00:33:11.484 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:33:11.484 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:11.484 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1562601 00:33:11.742 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:33:11.742 20:09:03 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:33:11.742 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1562601' 00:33:11.742 killing process with pid 1562601 00:33:11.742 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1562601 00:33:11.742 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1562601 00:33:12.000 20:09:03 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:33:12.000 00:33:12.000 real 0m11.133s 00:33:12.000 user 0m14.659s 00:33:12.000 sys 0m4.382s 00:33:12.000 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:12.000 20:09:03 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:12.000 ************************************ 00:33:12.000 END TEST bdev_nbd 00:33:12.000 ************************************ 00:33:12.000 20:09:03 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:33:12.000 20:09:03 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:33:12.001 20:09:03 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:33:12.001 20:09:03 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:33:12.001 20:09:03 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:33:12.001 20:09:03 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:12.001 20:09:03 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:12.260 ************************************ 00:33:12.260 START TEST bdev_fio 00:33:12.260 ************************************ 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@330 -- # local env_context 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:12.260 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:12.260 20:09:03 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:12.260 ************************************ 00:33:12.260 START TEST bdev_fio_rw_verify 00:33:12.260 ************************************ 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:12.260 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:12.261 20:09:03 
blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:12.261 20:09:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.519 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.519 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.519 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.519 
job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.519 fio-3.35 00:33:12.519 Starting 4 threads 00:33:27.432 00:33:27.432 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1565202: Wed Jul 24 20:09:16 2024 00:33:27.432 read: IOPS=17.0k, BW=66.3MiB/s (69.5MB/s)(663MiB/10001msec) 00:33:27.432 slat (usec): min=17, max=1317, avg=78.83, stdev=39.04 00:33:27.432 clat (usec): min=20, max=1860, avg=420.84, stdev=257.85 00:33:27.432 lat (usec): min=59, max=2102, avg=499.67, stdev=279.19 00:33:27.432 clat percentiles (usec): 00:33:27.432 | 50.000th=[ 367], 99.000th=[ 1336], 99.900th=[ 1614], 99.990th=[ 1729], 00:33:27.432 | 99.999th=[ 1811] 00:33:27.432 write: IOPS=18.6k, BW=72.8MiB/s (76.3MB/s)(711MiB/9770msec); 0 zone resets 00:33:27.432 slat (usec): min=29, max=470, avg=94.75, stdev=39.67 00:33:27.432 clat (usec): min=42, max=2498, avg=510.80, stdev=306.57 00:33:27.432 lat (usec): min=93, max=2681, avg=605.55, stdev=328.72 00:33:27.432 clat percentiles (usec): 00:33:27.432 | 50.000th=[ 457], 99.000th=[ 1549], 99.900th=[ 2073], 99.990th=[ 2245], 00:33:27.432 | 99.999th=[ 2442] 00:33:27.432 bw ( KiB/s): min=57896, max=95752, per=98.03%, avg=73076.63, stdev=2768.65, samples=76 00:33:27.432 iops : min=14474, max=23938, avg=18269.16, stdev=692.16, samples=76 00:33:27.432 lat (usec) : 50=0.01%, 100=3.46%, 250=20.53%, 500=38.50%, 750=22.71% 00:33:27.432 lat (usec) : 1000=10.03% 00:33:27.432 lat (msec) : 2=4.65%, 4=0.12% 00:33:27.432 cpu : usr=99.52%, sys=0.01%, ctx=75, majf=0, minf=262 00:33:27.432 IO depths : 1=9.8%, 2=25.6%, 4=51.3%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:27.432 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:27.432 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:27.432 issued rwts: total=169624,182079,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:27.432 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:27.432 
00:33:27.432 Run status group 0 (all jobs): 00:33:27.432 READ: bw=66.3MiB/s (69.5MB/s), 66.3MiB/s-66.3MiB/s (69.5MB/s-69.5MB/s), io=663MiB (695MB), run=10001-10001msec 00:33:27.432 WRITE: bw=72.8MiB/s (76.3MB/s), 72.8MiB/s-72.8MiB/s (76.3MB/s-76.3MB/s), io=711MiB (746MB), run=9770-9770msec 00:33:27.432 00:33:27.432 real 0m13.542s 00:33:27.432 user 0m45.874s 00:33:27.432 sys 0m0.531s 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:27.432 ************************************ 00:33:27.432 END TEST bdev_fio_rw_verify 00:33:27.432 ************************************ 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:27.432 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:27.433 
20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "5c78bf3d-dbcd-5525-b1c3-8af059af5dde"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5c78bf3d-dbcd-5525-b1c3-8af059af5dde",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "7eeeb06d-dc54-5716-8b1a-05fea4676e65"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7eeeb06d-dc54-5716-8b1a-05fea4676e65",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "016b3bf2-d203-5d9c-afdb-3504ed3aa97c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "016b3bf2-d203-5d9c-afdb-3504ed3aa97c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "f2121967-224b-5242-83bb-41172c543207"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f2121967-224b-5242-83bb-41172c543207",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:33:27.433 crypto_ram2 00:33:27.433 crypto_ram3 00:33:27.433 crypto_ram4 ]] 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' 
"name": "crypto_ram",' ' "aliases": [' ' "5c78bf3d-dbcd-5525-b1c3-8af059af5dde"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5c78bf3d-dbcd-5525-b1c3-8af059af5dde",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "7eeeb06d-dc54-5716-8b1a-05fea4676e65"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7eeeb06d-dc54-5716-8b1a-05fea4676e65",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "016b3bf2-d203-5d9c-afdb-3504ed3aa97c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "016b3bf2-d203-5d9c-afdb-3504ed3aa97c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "f2121967-224b-5242-83bb-41172c543207"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f2121967-224b-5242-83bb-41172c543207",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 
00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:27.433 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:27.434 ************************************ 00:33:27.434 START TEST bdev_fio_trim 00:33:27.434 ************************************ 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 
00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:27.434 20:09:17 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:27.434 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:27.434 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:27.434 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:27.434 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:27.434 fio-3.35 00:33:27.434 Starting 4 threads 00:33:39.630 00:33:39.630 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1567003: Wed Jul 24 20:09:30 2024 00:33:39.630 write: IOPS=37.8k, BW=148MiB/s (155MB/s)(1475MiB/10001msec); 0 zone resets 00:33:39.630 slat (usec): min=18, max=563, avg=59.89, stdev=37.74 00:33:39.630 clat (usec): min=26, max=2071, avg=269.55, stdev=186.88 00:33:39.630 lat (usec): min=54, max=2353, avg=329.45, stdev=213.00 00:33:39.630 clat 
percentiles (usec): 00:33:39.630 | 50.000th=[ 219], 99.000th=[ 1004], 99.900th=[ 1221], 99.990th=[ 1336], 00:33:39.630 | 99.999th=[ 2024] 00:33:39.630 bw ( KiB/s): min=143640, max=197792, per=100.00%, avg=151479.58, stdev=4158.73, samples=76 00:33:39.630 iops : min=35910, max=49448, avg=37869.84, stdev=1039.67, samples=76 00:33:39.630 trim: IOPS=37.8k, BW=148MiB/s (155MB/s)(1475MiB/10001msec); 0 zone resets 00:33:39.630 slat (usec): min=6, max=1487, avg=16.55, stdev= 7.52 00:33:39.630 clat (usec): min=28, max=1960, avg=253.85, stdev=122.39 00:33:39.630 lat (usec): min=39, max=1985, avg=270.40, stdev=125.55 00:33:39.630 clat percentiles (usec): 00:33:39.630 | 50.000th=[ 233], 99.000th=[ 701], 99.900th=[ 824], 99.990th=[ 906], 00:33:39.630 | 99.999th=[ 1369] 00:33:39.630 bw ( KiB/s): min=143640, max=197800, per=100.00%, avg=151481.26, stdev=4159.54, samples=76 00:33:39.630 iops : min=35910, max=49450, avg=37870.26, stdev=1039.87, samples=76 00:33:39.630 lat (usec) : 50=0.01%, 100=6.57%, 250=52.17%, 500=33.40%, 750=5.96% 00:33:39.630 lat (usec) : 1000=1.39% 00:33:39.630 lat (msec) : 2=0.51%, 4=0.01% 00:33:39.631 cpu : usr=99.59%, sys=0.00%, ctx=90, majf=0, minf=102 00:33:39.631 IO depths : 1=8.1%, 2=26.3%, 4=52.5%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:39.631 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:39.631 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:39.631 issued rwts: total=0,377675,377676,0 short=0,0,0,0 dropped=0,0,0,0 00:33:39.631 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:39.631 00:33:39.631 Run status group 0 (all jobs): 00:33:39.631 WRITE: bw=148MiB/s (155MB/s), 148MiB/s-148MiB/s (155MB/s-155MB/s), io=1475MiB (1547MB), run=10001-10001msec 00:33:39.631 TRIM: bw=148MiB/s (155MB/s), 148MiB/s-148MiB/s (155MB/s-155MB/s), io=1475MiB (1547MB), run=10001-10001msec 00:33:39.631 00:33:39.631 real 0m13.688s 00:33:39.631 user 0m45.880s 00:33:39.631 sys 0m0.543s 00:33:39.631 
20:09:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:39.631 20:09:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:39.631 ************************************ 00:33:39.631 END TEST bdev_fio_trim 00:33:39.631 ************************************ 00:33:39.889 20:09:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:33:39.889 20:09:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:39.889 20:09:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:33:39.889 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:39.889 20:09:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:33:39.889 00:33:39.889 real 0m27.624s 00:33:39.889 user 1m31.982s 00:33:39.889 sys 0m1.266s 00:33:39.889 20:09:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:39.889 20:09:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:39.889 ************************************ 00:33:39.889 END TEST bdev_fio 00:33:39.889 ************************************ 00:33:39.889 20:09:31 blockdev_crypto_aesni -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:39.889 20:09:31 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:39.889 20:09:31 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:33:39.889 20:09:31 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:39.889 20:09:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:39.889 
************************************ 00:33:39.889 START TEST bdev_verify 00:33:39.889 ************************************ 00:33:39.890 20:09:31 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:39.890 [2024-07-24 20:09:31.397533] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:33:39.890 [2024-07-24 20:09:31.397599] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1568414 ] 00:33:40.148 [2024-07-24 20:09:31.528496] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:40.148 [2024-07-24 20:09:31.635521] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:40.148 [2024-07-24 20:09:31.635526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:40.148 [2024-07-24 20:09:31.656982] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:40.148 [2024-07-24 20:09:31.665012] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:40.148 [2024-07-24 20:09:31.673042] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:40.407 [2024-07-24 20:09:31.783250] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:42.942 [2024-07-24 20:09:34.004948] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:42.942 [2024-07-24 20:09:34.005040] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:42.942 
[2024-07-24 20:09:34.005054] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.942 [2024-07-24 20:09:34.012967] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:42.942 [2024-07-24 20:09:34.012988] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:42.942 [2024-07-24 20:09:34.013000] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.942 [2024-07-24 20:09:34.020992] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:42.942 [2024-07-24 20:09:34.021012] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:42.942 [2024-07-24 20:09:34.021023] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.942 [2024-07-24 20:09:34.029014] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:42.942 [2024-07-24 20:09:34.029033] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:42.942 [2024-07-24 20:09:34.029044] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.942 Running I/O for 5 seconds... 
00:33:48.214 00:33:48.214 Latency(us) 00:33:48.214 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:48.214 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x0 length 0x1000 00:33:48.214 crypto_ram : 5.07 465.36 1.82 0.00 0.00 273669.98 2721.17 165036.74 00:33:48.214 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x1000 length 0x1000 00:33:48.214 crypto_ram : 5.08 377.83 1.48 0.00 0.00 337898.26 13107.20 205156.17 00:33:48.214 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x0 length 0x1000 00:33:48.214 crypto_ram2 : 5.07 466.59 1.82 0.00 0.00 272168.97 4131.62 153183.28 00:33:48.214 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x1000 length 0x1000 00:33:48.214 crypto_ram2 : 5.08 377.73 1.48 0.00 0.00 336688.69 14132.98 186920.07 00:33:48.214 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x0 length 0x1000 00:33:48.214 crypto_ram3 : 5.05 3625.69 14.16 0.00 0.00 34943.05 7522.39 27012.23 00:33:48.214 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x1000 length 0x1000 00:33:48.214 crypto_ram3 : 5.07 2929.10 11.44 0.00 0.00 43227.22 5755.77 31685.23 00:33:48.214 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x0 length 0x1000 00:33:48.214 crypto_ram4 : 5.06 3642.81 14.23 0.00 0.00 34739.48 2251.02 26898.25 00:33:48.214 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:48.214 Verification LBA range: start 0x1000 length 0x1000 00:33:48.214 crypto_ram4 : 5.07 2927.03 11.43 0.00 0.00 43127.37 6639.08 
31229.33 00:33:48.214 =================================================================================================================== 00:33:48.214 Total : 14812.14 57.86 0.00 0.00 68633.36 2251.02 205156.17 00:33:48.214 00:33:48.214 real 0m8.332s 00:33:48.214 user 0m15.756s 00:33:48.214 sys 0m0.392s 00:33:48.214 20:09:39 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:48.214 20:09:39 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:48.214 ************************************ 00:33:48.214 END TEST bdev_verify 00:33:48.214 ************************************ 00:33:48.214 20:09:39 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:48.214 20:09:39 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:33:48.214 20:09:39 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:48.214 20:09:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:48.214 ************************************ 00:33:48.214 START TEST bdev_verify_big_io 00:33:48.214 ************************************ 00:33:48.214 20:09:39 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:48.473 [2024-07-24 20:09:39.812982] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:33:48.473 [2024-07-24 20:09:39.813053] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1569478 ] 00:33:48.473 [2024-07-24 20:09:39.943177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:48.473 [2024-07-24 20:09:40.053685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:48.473 [2024-07-24 20:09:40.053689] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:48.732 [2024-07-24 20:09:40.075277] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:48.732 [2024-07-24 20:09:40.083305] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:48.732 [2024-07-24 20:09:40.091333] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:48.732 [2024-07-24 20:09:40.190374] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:51.265 [2024-07-24 20:09:42.423468] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:51.265 [2024-07-24 20:09:42.423560] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:51.265 [2024-07-24 20:09:42.423575] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.265 [2024-07-24 20:09:42.431486] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:51.265 [2024-07-24 20:09:42.431508] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:51.265 [2024-07-24 20:09:42.431520] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:33:51.265 [2024-07-24 20:09:42.439506] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:51.266 [2024-07-24 20:09:42.439526] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:51.266 [2024-07-24 20:09:42.439537] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.266 [2024-07-24 20:09:42.447530] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:51.266 [2024-07-24 20:09:42.447548] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:51.266 [2024-07-24 20:09:42.447560] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.266 Running I/O for 5 seconds... 00:33:51.833 [2024-07-24 20:09:43.418615] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:51.833 [2024-07-24 20:09:43.419211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:51.833 [2024-07-24 20:09:43.419441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:51.833 [2024-07-24 20:09:43.419564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:51.833 [2024-07-24 20:09:43.419638] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:51.833 [2024-07-24 20:09:43.420057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:51.833 [2024-07-24 20:09:43.421792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:51.833 [2024-07-24 20:09:43] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [identical src_mbufs allocation-failure messages repeated through 20:09:43.471; repeats elided]
00:33:52.096 [2024-07-24 20:09:43.471519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.471574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.472066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.472139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.472192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.472245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.473706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.473788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.473840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.473896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.474383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.474454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.474513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.096 [2024-07-24 20:09:43.474567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.476194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.476264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.476325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.476378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.476914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.476975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.477027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.477079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.478640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.478703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.478764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.478822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.096 [2024-07-24 20:09:43.479310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.479371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.479431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.479485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.481106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.481178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.481231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.481284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.481823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.481888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.481950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.482005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.483541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.096 [2024-07-24 20:09:43.483608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.483661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.483713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.484237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.484296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.484352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.484413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.486062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.486125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.486177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.486230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.486775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.486845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.096 [2024-07-24 20:09:43.486898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.486951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.488355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.488427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.488481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.488533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.489162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.489222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.489275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.489330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.491010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.491078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.096 [2024-07-24 20:09:43.491139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.096 [2024-07-24 20:09:43.491201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.491699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.491766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.491822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.491875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.493371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.493440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.493494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.493555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.494262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.494322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.494376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.494438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.097 [2024-07-24 20:09:43.496028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.496094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.496150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.496203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.496770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.496831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.496884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.496936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.498500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.498570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.498625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.498677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.499370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.097 [2024-07-24 20:09:43.499437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.499491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.499547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.501107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.502947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.504883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.506821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.507311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.508861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.509355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.510734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.513597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.515367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.097 [2024-07-24 20:09:43.517163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.519022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.520149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.521908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.523694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.525499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.528693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.529876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.530370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.532133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.534507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.536450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.538258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.097 [2024-07-24 20:09:43.540194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.542489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.544268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.546064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.547855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.550081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.551879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.553684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.554370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.557897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.559646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.561586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.563507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.097 [2024-07-24 20:09:43.565673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.566176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.567429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.569180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.572486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.574283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.576074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.576579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.579150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.580943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.582734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.584118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.586967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.097 [2024-07-24 20:09:43.587477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.589012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.590780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.592963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.594418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.596166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.597998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.601669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.603464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.097 [2024-07-24 20:09:43.605251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.606638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.608769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.610567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.098 [2024-07-24 20:09:43.611984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.612481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.615742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.617071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.618826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.620593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.621597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.622173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.623935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.625721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.628910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.630715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.098 [2024-07-24 20:09:43.631745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.098 [2024-07-24 20:09:43.632239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.362 [2024-07-24 20:09:43.911232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.911298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.911351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.911429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.911767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.911952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.912012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.912065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.912118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.913673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.913737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.913798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.913865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.362 [2024-07-24 20:09:43.914201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.914384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.914452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.914506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.914561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.916223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.916291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.916349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.916410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.916817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.916997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.917061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.917125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.362 [2024-07-24 20:09:43.917180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.918594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.918664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.918717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.918770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.919178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.919358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.919423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.919485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.919543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.921546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.921611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.921664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.362 [2024-07-24 20:09:43.921718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.922091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.922272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.922337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.922404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.362 [2024-07-24 20:09:43.922458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.923896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.923960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.924013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.924072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.924485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.924665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.924735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.363 [2024-07-24 20:09:43.924792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.924846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.926814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.926878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.926930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.926984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.927333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.927523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.927593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.927646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.927700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.929217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.929280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.363 [2024-07-24 20:09:43.929333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.929386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.929762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.929944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.930002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.930055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.930110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.931729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.931793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.931853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.931916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.932252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.932445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.363 [2024-07-24 20:09:43.932504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.932558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.932612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.934191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.934255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.934315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.934377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.934719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.934901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.934959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.935013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.935067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.936741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.363 [2024-07-24 20:09:43.936818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.936871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.936924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.937292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.937482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.937545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.937599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.937667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.939133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.939200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.939257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.939311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.939656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.363 [2024-07-24 20:09:43.939835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.939894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.939955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.940009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.941832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.941898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.941951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.942004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.942403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.942583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.942650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.942709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.942763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.363 [2024-07-24 20:09:43.944251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.944315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.944368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.944434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.944771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.944951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.945018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.945072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.945137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.946934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.946998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.947051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.947112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.363 [2024-07-24 20:09:43.947455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.947635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.947699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.947752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.363 [2024-07-24 20:09:43.947805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.364 [2024-07-24 20:09:43.949321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.364 [2024-07-24 20:09:43.949386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.364 [2024-07-24 20:09:43.949446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.364 [2024-07-24 20:09:43.949506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.364 [2024-07-24 20:09:43.949841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.364 [2024-07-24 20:09:43.950022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.950104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.950173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.624 [2024-07-24 20:09:43.950227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.951811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.951882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.951941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.951994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.952329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.952520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.952579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.952641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.952697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.954228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.954297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.954350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.624 [2024-07-24 20:09:43.954416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.954900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.955079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.955138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.955193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.955247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.956831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.956896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.956948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.957002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.957335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.957523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.624 [2024-07-24 20:09:43.957589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.624 [2024-07-24 20:09:43.957644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [message repeated for subsequent allocations through 2024-07-24 20:09:44.284309]
00:33:52.890 [2024-07-24 20:09:44.286096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.287888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.288330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.288958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.290907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.292744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.294546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.297787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.299431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.299932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.301122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.301515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.303435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.890 [2024-07-24 20:09:44.305225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.306720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.308464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.310496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.312371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.314132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.315906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.316293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.318222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.320007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.321778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.323053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.326422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.890 [2024-07-24 20:09:44.328212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.329565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.331324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.331723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.333646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.334153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.334681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.336427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.339590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.341371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.343163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.343229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.343658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.890 [2024-07-24 20:09:44.344273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.346001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.347899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.349842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.353154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.353233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.353291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.353345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.353757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.354384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.354463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.354518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.354572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.890 [2024-07-24 20:09:44.355970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.356899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.358299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.358375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.358438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.358493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.890 [2024-07-24 20:09:44.358932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.359126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.359192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.359248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.359305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.360739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.360804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.360857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.360910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.361325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.361522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.361592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.890 [2024-07-24 20:09:44.361647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.890 [2024-07-24 20:09:44.361704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.363127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.363193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.363246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.363300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.363829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.364016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.364094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.364150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.364203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.365770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.365836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.365896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.891 [2024-07-24 20:09:44.365957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.366295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.366490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.366564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.366621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.366675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.368154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.368224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.368279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.368334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.368908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.369096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.369155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.891 [2024-07-24 20:09:44.369209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.369263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.370849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.370914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.370967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.371021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.371403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.371595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.371654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.371713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.371766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.373843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.373920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.891 [2024-07-24 20:09:44.373988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.374053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.374402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.374586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.374650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.374718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.374776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.376341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.376415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.376468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.376527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.376886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.377068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.891 [2024-07-24 20:09:44.377131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.377184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.377238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.378817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.378882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.378948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.379004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.379498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.379686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.379759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.379816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.379869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.381621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.891 [2024-07-24 20:09:44.381687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.381758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.381814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.382265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.382463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.382537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.382594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.382648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.384749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.384826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.384915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.384982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.385414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.891 [2024-07-24 20:09:44.385599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.385663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.385732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.385800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.387477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.387553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.387608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.387665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.388233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.891 [2024-07-24 20:09:44.388424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.892 [2024-07-24 20:09:44.388484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.892 [2024-07-24 20:09:44.388539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:52.892 [2024-07-24 20:09:44.388596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:52.892 [2024-07-24 20:09:44.390288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:53.155 [2024-07-24 20:09:44.620313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.155 [2024-07-24 20:09:44.620741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.155 [2024-07-24 20:09:44.622653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.624584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.625086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.625686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.628217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.629960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.631735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.633719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.634182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.634808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.636599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.156 [2024-07-24 20:09:44.638509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.640451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.643816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.645351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.645863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.646878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.647293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.648994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.650999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.652090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.653856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.655925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.657878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.156 [2024-07-24 20:09:44.659823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.661835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.662231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.664291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.666118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.668135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.669376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.672874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.674900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.676090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.677812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.678211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.680337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.156 [2024-07-24 20:09:44.680855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.681354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.683200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.686465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.688247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.690236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.691283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.691742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.693165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.694933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.696691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.698632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.702213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.156 [2024-07-24 20:09:44.702732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.703253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.705001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.705379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.707519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.708631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.710383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.712171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.715268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.717017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.718930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.720720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.721077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.156 [2024-07-24 20:09:44.723124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.725105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.726635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.727136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.730733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.731846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.733584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.735365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.735717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.736338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.736851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.738722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.740501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.156 [2024-07-24 20:09:44.743713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.156 [2024-07-24 20:09:44.745724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.746707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.747210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.747590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.749506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.751454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.753067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.755010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.756924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.757998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.759737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.761504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.419 [2024-07-24 20:09:44.761846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.763164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.764914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.766674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.768678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.772222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.774212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.775690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.777633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.777979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.780109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.781444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.781941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.419 [2024-07-24 20:09:44.783054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.785719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.787475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.789252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.791247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.791754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.792373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.792888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.793387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.793890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.797393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.799340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.799844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.419 [2024-07-24 20:09:44.800363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.800715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.802592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.803926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.805814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.807756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.809903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.810420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.810920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.811424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.811845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.812474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.812979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.419 [2024-07-24 20:09:44.813488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.813987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.816224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.816739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.817242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.817756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.818253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.818876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.819374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.419 [2024-07-24 20:09:44.819880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.820378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.822659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.823183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.420 [2024-07-24 20:09:44.823687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.824184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.824676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.825311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.825827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.826326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.826851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.829072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.829593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.830094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.830600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.831120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.831746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.420 [2024-07-24 20:09:44.832249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.832749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.833242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.835724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.836645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.837852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.839684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.840185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.840813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.841745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.842935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.844761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.420 [2024-07-24 20:09:44.848091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.420 [2024-07-24 20:09:44.849745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 (last message repeated for subsequent allocation attempts through [2024-07-24 20:09:44.935659]; intermediate timestamps elided)
00:33:53.423 [2024-07-24 20:09:44.935998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.936177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.936239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.936299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.936358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.937950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.938013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.938065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.938124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.938513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.938694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.938774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.938829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.423 [2024-07-24 20:09:44.938882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.940588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.940652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.940705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.942639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.942978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.943160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.943218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.943271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.943324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.423 [2024-07-24 20:09:44.945001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.946943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.948586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.424 [2024-07-24 20:09:44.949078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.949578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.949761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.951513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.953503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.955504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.959042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.959556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.960050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.961799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.962141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.964269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.965178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.424 [2024-07-24 20:09:44.966922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.968914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.972266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.974189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.976197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.977606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.977948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.980000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.982009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.983237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.983737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.987161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.988548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.424 [2024-07-24 20:09:44.990300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.992306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.992653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.993266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.993780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.995523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:44.997519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:45.001029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:45.003020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:45.003717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:45.004215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:45.004564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.424 [2024-07-24 20:09:45.006609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.424 [2024-07-24 20:09:45.008629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.009866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.011793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.013797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.015218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.016977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.018928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.019272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.021008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.022825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.024862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.026281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.030165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.684 [2024-07-24 20:09:45.032181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.033177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.034948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.035291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.037403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.037904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.038401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.040162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.043440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.045448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.047463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.048114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.048554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.684 [2024-07-24 20:09:45.050350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.684 [2024-07-24 20:09:45.052221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.054237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.055673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.058416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.058918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.060151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.061896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.062241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.064302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.065566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.067311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.069300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.685 [2024-07-24 20:09:45.073029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.075026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.077030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.077986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.078329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.080458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.082466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.083207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.083708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.086649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.088595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.090533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.092534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.685 [2024-07-24 20:09:45.092942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.093574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.094735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.096485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.098488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.102092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.103962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.104460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.105129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.105521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.107659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.109676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.110562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.685 [2024-07-24 20:09:45.112313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.114384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.116330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.118350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.120340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.120793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.122853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.124872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.126873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.127827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.131445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.133294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.134812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.685 [2024-07-24 20:09:45.136539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.136895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.138673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.139171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.139985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.141738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.145067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.147082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.149102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.149609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.150120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.152169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.154135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.685 [2024-07-24 20:09:45.155718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.157323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.159443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.159957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.160460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.160968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.161311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.163218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.165169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.166407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.168170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.171459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.685 [2024-07-24 20:09:45.173327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.685 [2024-07-24 20:09:45.175182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error repeated for every allocation attempt between 20:09:45.175182 and 20:09:45.369484 ...]
00:33:53.950 [2024-07-24 20:09:45.369664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.369721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.369783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.369837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.371395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.371460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.371513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.371566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.371908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.372089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.372148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.372201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.372254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.950 [2024-07-24 20:09:45.376071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.376140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.376193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.376246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.376727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.376911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.376973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.377026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.377079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.381170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.381241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.381294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.381351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.950 [2024-07-24 20:09:45.381749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.381932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.381989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.382051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.382107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.386824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.386889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.386941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.386994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.387521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.387706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.387765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.387819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.950 [2024-07-24 20:09:45.387874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.393362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.393433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.393486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.393539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.393923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.394107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.394185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.394240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.394293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.398247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.398313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.398381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.950 [2024-07-24 20:09:45.398442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.398782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.398961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.399030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.399083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.399145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.404598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.404663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.950 [2024-07-24 20:09:45.404735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.404789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.405247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.405439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.405499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.951 [2024-07-24 20:09:45.405553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.405606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.410399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.410467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.410519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.410572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.410913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.411100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.411158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.411211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.411263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.415065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.415140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.951 [2024-07-24 20:09:45.415193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.415246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.415667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.415853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.415915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.415968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.416020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.420262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.420340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.420401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.420454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.420828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.421006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.951 [2024-07-24 20:09:45.421068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.421121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.421187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.426154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.426221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.426274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.426328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.426775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.426958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.427015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.427070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.427123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.432745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.951 [2024-07-24 20:09:45.432813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.432867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.432920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.433294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.433490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.433553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.433615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.433668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.437683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.437749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.437807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.437870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.438210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.951 [2024-07-24 20:09:45.438402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.438461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.438514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.438576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.443992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.444059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.444113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.444183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.444724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.444904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.444965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.445020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.445072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.951 [2024-07-24 20:09:45.449725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.449814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.449868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.449922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.450261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.450455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.450523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.450580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.450634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.454665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.454730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.454790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.454846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.951 [2024-07-24 20:09:45.455245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.455434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.455494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.455554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.455637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.456468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.456533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.951 [2024-07-24 20:09:45.456586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.457426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.457875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.458058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.458116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.458169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.952 [2024-07-24 20:09:45.458224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.459719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.460607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.462355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.464353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.464702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.464882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.465376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.465940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.467678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.469788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.470287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:53.952 [2024-07-24 20:09:45.471395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:53.952 [2024-07-24 20:09:45.472404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:53.952 [last message repeated 50 more times between 20:09:45.472 and 20:09:45.516]
00:33:57.241
00:33:57.241 Latency(us)
00:33:57.241 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:57.241 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x0 length 0x100
00:33:57.241 crypto_ram : 5.80 44.15 2.76 0.00 0.00 2813442.67 73400.32 2523876.84
00:33:57.241 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x100 length 0x100
00:33:57.241 crypto_ram : 6.04 41.39 2.59 0.00 0.00 2980942.91 141329.81 3122021.06
00:33:57.241 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x0 length 0x100
00:33:57.241 crypto_ram2 : 5.80 44.14 2.76 0.00 0.00 2711993.66 72944.42 2523876.84
00:33:57.241 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x100 length 0x100
00:33:57.241 crypto_ram2 : 6.04 42.02 2.63 0.00 0.00 2826964.13 62002.75 3122021.06
00:33:57.241 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x0 length 0x100
00:33:57.241 crypto_ram3 : 5.60 289.72 18.11 0.00 0.00 394107.64 49009.53 587202.56
00:33:57.241 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x100 length 0x100
00:33:57.241 crypto_ram3 : 5.68 225.91 14.12 0.00 0.00 495333.45 13563.10 630969.21
00:33:57.241 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x0 length 0x100
00:33:57.241 crypto_ram4 : 5.68 306.14 19.13 0.00 0.00 362217.70 17210.32 682030.30
00:33:57.241 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:57.241 Verification LBA range: start 0x100 length 0x100
00:33:57.241 crypto_ram4 : 5.80 243.28 15.21 0.00 0.00 447493.17 4416.56 634616.43
00:33:57.241 ===================================================================================================================
00:33:57.241 Total : 1236.75 77.30 0.00 0.00 765540.91 4416.56 3122021.06
00:33:57.500
00:33:57.500 real 0m9.314s
00:33:57.500 user 0m17.652s
00:33:57.500 sys 0m0.452s
00:33:57.500 20:09:49 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable
00:33:57.500 20:09:49 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:57.500 ************************************
00:33:57.500 END TEST bdev_verify_big_io
00:33:57.500 ************************************
00:33:57.759 20:09:49 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:57.759 20:09:49 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:33:57.759 20:09:49 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:33:57.759 20:09:49 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:57.759 ************************************
00:33:57.759 START TEST bdev_write_zeroes
00:33:57.759 ************************************
00:33:57.759 20:09:49 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:57.759 [2024-07-24 20:09:49.210405] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:33:57.759 [2024-07-24 20:09:49.210453] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1570707 ]
00:33:57.759 [2024-07-24 20:09:49.321430] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:58.018 [2024-07-24 20:09:49.427118] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:58.018 [2024-07-24 20:09:49.448542] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:33:58.018 [2024-07-24 20:09:49.456568] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:58.018 [2024-07-24 20:09:49.464585] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:58.018 [2024-07-24 20:09:49.564451] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:34:00.556 [2024-07-24 20:09:51.799604] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:34:00.556 [2024-07-24 20:09:51.799667] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:00.556 [2024-07-24 20:09:51.799682] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:00.556 [2024-07-24 20:09:51.807623] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:34:00.556 [2024-07-24 20:09:51.807644] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:00.556 [2024-07-24 20:09:51.807656] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:00.556 [2024-07-24 20:09:51.815642] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:34:00.556 [2024-07-24 20:09:51.815661] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:00.556 [2024-07-24 20:09:51.815673] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:00.556 [2024-07-24 20:09:51.823665] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:34:00.556 [2024-07-24 20:09:51.823684] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:00.556 [2024-07-24 20:09:51.823695] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:00.556 Running I/O for 1 seconds...
00:34:01.583
00:34:01.583 Latency(us)
00:34:01.583 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:01.583 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:01.583 crypto_ram : 1.03 1948.22 7.61 0.00 0.00 65174.58 5470.83 77959.35
00:34:01.583 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:01.583 crypto_ram2 : 1.03 1953.96 7.63 0.00 0.00 64630.07 5442.34 72944.42
00:34:01.583 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:01.583 crypto_ram3 : 1.02 14978.70 58.51 0.00 0.00 8417.22 2507.46 10998.65
00:34:01.583 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:01.583 crypto_ram4 : 1.02 15015.85 58.66 0.00 0.00 8369.75 2564.45 8776.13
===================================================================================================================
Total : 33896.74 132.41 0.00 0.00 14926.93 2507.46 77959.35
00:34:01.843
00:34:01.843 real 0m4.233s
00:34:01.843 user 0m3.810s
00:34:01.843 sys 0m0.379s
00:34:01.843 20:09:53 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable
00:34:01.843
20:09:53 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:34:01.843 ************************************
00:34:01.843 END TEST bdev_write_zeroes
00:34:01.843 ************************************
00:34:01.843 20:09:53 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:01.843 20:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:34:01.843 20:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:34:01.843 20:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:34:02.102 ************************************
00:34:02.102 START TEST bdev_json_nonenclosed
00:34:02.102 ************************************
00:34:02.102 20:09:53 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:02.102 [2024-07-24 20:09:53.538667] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:34:02.102 [2024-07-24 20:09:53.538730] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571251 ]
00:34:02.102 [2024-07-24 20:09:53.668335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:02.362 [2024-07-24 20:09:53.773809] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:02.362 [2024-07-24 20:09:53.773883] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:34:02.362 [2024-07-24 20:09:53.773902] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:02.362 [2024-07-24 20:09:53.773914] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:02.362
00:34:02.362 real 0m0.406s
00:34:02.362 user 0m0.241s
00:34:02.362 sys 0m0.162s
00:34:02.362 20:09:53 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable
00:34:02.362 20:09:53 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:34:02.362 ************************************
00:34:02.362 END TEST bdev_json_nonenclosed
00:34:02.362 ************************************
00:34:02.362 20:09:53 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:02.362 20:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:34:02.362 20:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:34:02.362 20:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:34:02.621 ************************************
00:34:02.621 START TEST bdev_json_nonarray
00:34:02.621 ************************************
00:34:02.621 20:09:53 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:02.621 [2024-07-24 20:09:54.024643] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:34:02.621 [2024-07-24 20:09:54.024690] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571278 ]
00:34:02.621 [2024-07-24 20:09:54.136871] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:02.880 [2024-07-24 20:09:54.242960] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:02.880 [2024-07-24 20:09:54.243040] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:34:02.880 [2024-07-24 20:09:54.243060] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:02.880 [2024-07-24 20:09:54.243073] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:02.880
00:34:02.880 real 0m0.381s
00:34:02.880 user 0m0.239s
00:34:02.880 sys 0m0.138s
00:34:02.880 20:09:54 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable
00:34:02.880 20:09:54 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:34:02.880 ************************************
00:34:02.880 END TEST bdev_json_nonarray
00:34:02.880 ************************************
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]]
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]]
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]]
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]]
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]]
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]]
00:34:02.880 20:09:54 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]]
00:34:02.880
00:34:02.880 real 1m14.080s
00:34:02.880 user 2m43.060s
00:34:02.880 sys 0m9.462s
00:34:02.880 20:09:54 blockdev_crypto_aesni -- common/autotest_common.sh@1126 -- # xtrace_disable
00:34:02.880 20:09:54 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:34:02.880 ************************************
00:34:02.880 END TEST blockdev_crypto_aesni
00:34:02.880 ************************************
00:34:02.880 20:09:54 -- spdk/autotest.sh@362 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:34:02.880 20:09:54 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']'
00:34:02.880 20:09:54 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:34:02.880 20:09:54 -- common/autotest_common.sh@10 -- # set +x
00:34:03.139 ************************************
00:34:03.139 START TEST blockdev_crypto_sw
00:34:03.139 ************************************
00:34:03.140 20:09:54 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:34:03.140 * Looking for test storage...
00:34:03.140 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # :
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']'
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device=
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek=
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx=
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc=
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']'
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]]
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]]
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1571503
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1571503
00:34:03.140 20:09:54 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc
00:34:03.140 20:09:54 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # '[' -z 1571503 ']'
00:34:03.140 20:09:54 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock
00:34:03.140 20:09:54 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # local max_retries=100
00:34:03.140 20:09:54 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:34:03.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:34:03.140 20:09:54 blockdev_crypto_sw -- common/autotest_common.sh@840 -- # xtrace_disable
00:34:03.140 20:09:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:03.140 [2024-07-24 20:09:54.679178] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:34:03.140 [2024-07-24 20:09:54.679237] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571503 ]
00:34:03.399 [2024-07-24 20:09:54.807255] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:03.399 [2024-07-24 20:09:54.918934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:04.334 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:34:04.334 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@864 -- # return 0
00:34:04.334 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in
00:34:04.334 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf
00:34:04.334 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd
00:34:04.334 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable
00:34:04.334 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:04.334 Malloc0
00:34:04.334 Malloc1
00:34:04.334 true
00:34:04.334 true
00:34:04.334 true
00:34:04.334 [2024-07-24 20:09:55.890984] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:04.334 crypto_ram
00:34:04.334 [2024-07-24 20:09:55.899007] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:34:04.334 crypto_ram2
00:34:04.334 [2024-07-24 20:09:55.907037] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:34:04.334 crypto_ram3
00:34:04.334 [
00:34:04.334 {
00:34:04.334 "name": "Malloc1",
00:34:04.334 "aliases": [
00:34:04.334 "5a95a8df-b9e7-4752-b57d-c682ebbddd93"
00:34:04.334 ],
00:34:04.334 "product_name": "Malloc disk",
00:34:04.334 "block_size": 4096,
00:34:04.334 "num_blocks": 4096,
00:34:04.334 "uuid": "5a95a8df-b9e7-4752-b57d-c682ebbddd93",
00:34:04.334 "assigned_rate_limits": {
00:34:04.334 "rw_ios_per_sec": 0,
00:34:04.334 "rw_mbytes_per_sec": 0,
00:34:04.334 "r_mbytes_per_sec": 0,
00:34:04.334 "w_mbytes_per_sec": 0
00:34:04.334 },
00:34:04.334 "claimed": true,
00:34:04.334 "claim_type": "exclusive_write",
00:34:04.334 "zoned": false,
00:34:04.334 "supported_io_types": {
00:34:04.334 "read": true,
00:34:04.334 "write": true,
00:34:04.334 "unmap": true,
00:34:04.334 "flush": true,
00:34:04.334 "reset": true,
00:34:04.334 "nvme_admin": false,
00:34:04.334 "nvme_io": false,
00:34:04.334 "nvme_io_md": false,
00:34:04.334 "write_zeroes": true,
00:34:04.334 "zcopy": true,
00:34:04.334 "get_zone_info": false,
00:34:04.334 "zone_management": false,
00:34:04.334 "zone_append": false,
00:34:04.334 "compare": false,
00:34:04.334 "compare_and_write": false,
00:34:04.334 "abort": true,
00:34:04.334 "seek_hole": false,
00:34:04.334 "seek_data": false,
00:34:04.334 "copy": true,
00:34:04.334 "nvme_iov_md": false
00:34:04.334 },
00:34:04.334 "memory_domains": [
00:34:04.334 {
00:34:04.334 "dma_device_id": "system",
00:34:04.334 "dma_device_type": 1
00:34:04.334 },
00:34:04.334 {
00:34:04.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:34:04.334 "dma_device_type": 2
00:34:04.334 }
00:34:04.334 ],
00:34:04.334 "driver_specific": {}
00:34:04.593 }
00:34:04.593 ]
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:34:04.593 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:34:04.593 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat
00:34:04.593 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel
00:34:04.593 20:09:55
blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:34:04.593 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:34:04.593 20:09:55 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable
00:34:04.593 20:09:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)'
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ef10aa21-cf1c-5ed3-b349-916a8eb7be75"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ef10aa21-cf1c-5ed3-b349-916a8eb7be75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "fb2fce29-7246-5ccb-9a03-f03ea9e87789"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "fb2fce29-7246-5ccb-9a03-f03ea9e87789",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}'
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}")
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT
00:34:04.593 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 1571503
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # '[' -z 1571503 ']'
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # kill -0 1571503
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # uname
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1571503
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1571503'
00:34:04.593 killing process with pid 1571503
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@969 -- # kill 1571503
00:34:04.593 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@974 -- # wait 1571503
00:34:05.159 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT
00:34:05.159 20:09:56 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram ''
00:34:05.159 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']'
00:34:05.159 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable
00:34:05.159 20:09:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:05.159 ************************************
00:34:05.159 START TEST bdev_hello_world
00:34:05.159 ************************************
00:34:05.159 20:09:56 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram ''
00:34:05.160 [2024-07-24 20:09:56.620739] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:34:05.160 [2024-07-24 20:09:56.620801] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571707 ]
00:34:05.160 [2024-07-24 20:09:56.747091] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:05.417 [2024-07-24 20:09:56.846349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:05.417 [2024-07-24 20:09:57.009952] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:05.417 [2024-07-24 20:09:57.010006] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:05.417 [2024-07-24 20:09:57.010021] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:05.675 [2024-07-24 20:09:57.017970] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:34:05.675 [2024-07-24 20:09:57.017993] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:05.675 [2024-07-24 20:09:57.018005] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:05.675 [2024-07-24 20:09:57.025992] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:34:05.675 [2024-07-24 20:09:57.026012] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:34:05.675 [2024-07-24 20:09:57.026023] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:05.675 [2024-07-24 20:09:57.066634] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application
00:34:05.675 [2024-07-24 20:09:57.066671] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram
00:34:05.675 [2024-07-24 20:09:57.066689] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel
00:34:05.675 [2024-07-24 20:09:57.068084] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev
00:34:05.675 [2024-07-24 20:09:57.068153] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully
00:34:05.675 [2024-07-24 20:09:57.068169] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io
00:34:05.675 [2024-07-24 20:09:57.068205] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World!
00:34:05.675 00:34:05.675 [2024-07-24 20:09:57.068223] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:05.934 00:34:05.934 real 0m0.730s 00:34:05.934 user 0m0.486s 00:34:05.934 sys 0m0.228s 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:05.934 ************************************ 00:34:05.934 END TEST bdev_hello_world 00:34:05.934 ************************************ 00:34:05.934 20:09:57 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:34:05.934 20:09:57 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:34:05.934 20:09:57 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:05.934 20:09:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:05.934 ************************************ 00:34:05.934 START TEST bdev_bounds 00:34:05.934 ************************************ 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1571893 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1571893' 00:34:05.934 Process bdevio pid: 1571893 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1571893 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@831 -- # '[' -z 1571893 ']' 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:05.934 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:05.934 20:09:57 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:05.934 [2024-07-24 20:09:57.437807] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:34:05.934 [2024-07-24 20:09:57.437877] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571893 ] 00:34:06.193 [2024-07-24 20:09:57.569702] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:06.193 [2024-07-24 20:09:57.674093] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:06.193 [2024-07-24 20:09:57.674197] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:06.193 [2024-07-24 20:09:57.674198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:06.451 [2024-07-24 20:09:57.842872] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:06.451 [2024-07-24 20:09:57.842947] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:06.451 [2024-07-24 20:09:57.842961] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending 
base bdev arrival 00:34:06.451 [2024-07-24 20:09:57.850895] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:06.451 [2024-07-24 20:09:57.850916] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:06.451 [2024-07-24 20:09:57.850927] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:06.451 [2024-07-24 20:09:57.858915] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:06.451 [2024-07-24 20:09:57.858933] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:06.451 [2024-07-24 20:09:57.858944] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.018 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:07.018 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:34:07.018 20:09:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:07.018 I/O targets: 00:34:07.018 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:34:07.018 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:34:07.018 00:34:07.018 00:34:07.018 CUnit - A unit testing framework for C - Version 2.1-3 00:34:07.018 http://cunit.sourceforge.net/ 00:34:07.018 00:34:07.018 00:34:07.018 Suite: bdevio tests on: crypto_ram3 00:34:07.018 Test: blockdev write read block ...passed 00:34:07.018 Test: blockdev write zeroes read block ...passed 00:34:07.018 Test: blockdev write zeroes read no split ...passed 00:34:07.018 Test: blockdev write zeroes read split ...passed 00:34:07.018 Test: blockdev write zeroes read split partial ...passed 00:34:07.018 Test: blockdev reset ...passed 00:34:07.018 Test: blockdev write read 8 blocks ...passed 00:34:07.018 Test: blockdev write read 
size > 128k ...passed 00:34:07.018 Test: blockdev write read invalid size ...passed 00:34:07.018 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:07.018 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:07.018 Test: blockdev write read max offset ...passed 00:34:07.018 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:07.018 Test: blockdev writev readv 8 blocks ...passed 00:34:07.018 Test: blockdev writev readv 30 x 1block ...passed 00:34:07.018 Test: blockdev writev readv block ...passed 00:34:07.018 Test: blockdev writev readv size > 128k ...passed 00:34:07.018 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:07.018 Test: blockdev comparev and writev ...passed 00:34:07.018 Test: blockdev nvme passthru rw ...passed 00:34:07.018 Test: blockdev nvme passthru vendor specific ...passed 00:34:07.018 Test: blockdev nvme admin passthru ...passed 00:34:07.018 Test: blockdev copy ...passed 00:34:07.018 Suite: bdevio tests on: crypto_ram 00:34:07.018 Test: blockdev write read block ...passed 00:34:07.018 Test: blockdev write zeroes read block ...passed 00:34:07.018 Test: blockdev write zeroes read no split ...passed 00:34:07.018 Test: blockdev write zeroes read split ...passed 00:34:07.018 Test: blockdev write zeroes read split partial ...passed 00:34:07.018 Test: blockdev reset ...passed 00:34:07.018 Test: blockdev write read 8 blocks ...passed 00:34:07.018 Test: blockdev write read size > 128k ...passed 00:34:07.018 Test: blockdev write read invalid size ...passed 00:34:07.018 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:07.018 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:07.018 Test: blockdev write read max offset ...passed 00:34:07.018 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:07.018 Test: blockdev writev readv 8 blocks ...passed 00:34:07.018 Test: blockdev writev 
readv 30 x 1block ...passed 00:34:07.018 Test: blockdev writev readv block ...passed 00:34:07.018 Test: blockdev writev readv size > 128k ...passed 00:34:07.018 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:07.019 Test: blockdev comparev and writev ...passed 00:34:07.019 Test: blockdev nvme passthru rw ...passed 00:34:07.019 Test: blockdev nvme passthru vendor specific ...passed 00:34:07.019 Test: blockdev nvme admin passthru ...passed 00:34:07.019 Test: blockdev copy ...passed 00:34:07.019 00:34:07.019 Run Summary: Type Total Ran Passed Failed Inactive 00:34:07.019 suites 2 2 n/a 0 0 00:34:07.019 tests 46 46 46 0 0 00:34:07.019 asserts 260 260 260 0 n/a 00:34:07.019 00:34:07.019 Elapsed time = 0.193 seconds 00:34:07.019 0 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1571893 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1571893 ']' 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1571893 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1571893 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1571893' 00:34:07.019 killing process with pid 1571893 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1571893 00:34:07.019 20:09:58 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@974 -- # wait 1571893 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:34:07.278 00:34:07.278 real 0m1.400s 00:34:07.278 user 0m3.465s 00:34:07.278 sys 0m0.387s 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:07.278 ************************************ 00:34:07.278 END TEST bdev_bounds 00:34:07.278 ************************************ 00:34:07.278 20:09:58 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:34:07.278 20:09:58 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:34:07.278 20:09:58 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:07.278 20:09:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:07.278 ************************************ 00:34:07.278 START TEST bdev_nbd 00:34:07.278 ************************************ 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:34:07.278 
20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1572101 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1572101 /var/tmp/spdk-nbd.sock 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1572101 ']' 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:07.278 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:07.278 20:09:58 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:07.537 [2024-07-24 20:09:58.966835] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:34:07.537 [2024-07-24 20:09:58.966969] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:07.796 [2024-07-24 20:09:59.163377] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:07.796 [2024-07-24 20:09:59.259595] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:08.055 [2024-07-24 20:09:59.425875] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:08.055 [2024-07-24 20:09:59.425941] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:08.055 [2024-07-24 20:09:59.425956] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:08.055 [2024-07-24 20:09:59.433893] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:08.055 [2024-07-24 20:09:59.433913] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:08.055 [2024-07-24 20:09:59.433925] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:08.055 [2024-07-24 20:09:59.441915] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:08.055 [2024-07-24 20:09:59.441933] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:08.055 [2024-07-24 20:09:59.441944] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:34:08.623 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:08.882 1+0 records in 00:34:08.882 1+0 records out 00:34:08.882 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307179 s, 13.3 MB/s 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd 
-- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:34:08.882 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:34:09.141 1+0 records in 00:34:09.141 1+0 records out 00:34:09.141 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337967 s, 12.1 MB/s 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:34:09.141 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:09.400 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:09.400 { 00:34:09.400 "nbd_device": "/dev/nbd0", 00:34:09.400 "bdev_name": "crypto_ram" 00:34:09.400 }, 00:34:09.400 { 00:34:09.400 "nbd_device": "/dev/nbd1", 00:34:09.400 "bdev_name": "crypto_ram3" 00:34:09.400 } 00:34:09.400 ]' 00:34:09.400 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:09.400 20:10:00 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:09.400 { 00:34:09.400 "nbd_device": "/dev/nbd0", 00:34:09.400 "bdev_name": "crypto_ram" 00:34:09.400 }, 00:34:09.400 { 00:34:09.400 "nbd_device": "/dev/nbd1", 00:34:09.400 "bdev_name": "crypto_ram3" 00:34:09.400 } 00:34:09.400 ]' 00:34:09.400 20:10:00 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:09.658 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:34:09.658 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:09.658 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:09.658 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:09.658 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:09.658 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:09.658 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:09.917 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd1 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:10.176 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:10.435 20:10:01 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:10.435 20:10:01 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:10.695 /dev/nbd0 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:10.695 1+0 records in 00:34:10.695 1+0 records out 00:34:10.695 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285557 s, 14.3 MB/s 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:10.695 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:34:10.954 /dev/nbd1 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:10.954 1+0 records in 00:34:10.954 1+0 records out 00:34:10.954 4096 bytes (4.1 kB, 4.0 
KiB) copied, 0.000362839 s, 11.3 MB/s 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:10.954 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:11.213 { 00:34:11.213 "nbd_device": "/dev/nbd0", 00:34:11.213 "bdev_name": "crypto_ram" 00:34:11.213 }, 00:34:11.213 { 00:34:11.213 "nbd_device": "/dev/nbd1", 00:34:11.213 "bdev_name": "crypto_ram3" 00:34:11.213 } 00:34:11.213 ]' 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:11.213 { 00:34:11.213 "nbd_device": "/dev/nbd0", 00:34:11.213 "bdev_name": "crypto_ram" 00:34:11.213 }, 00:34:11.213 { 00:34:11.213 "nbd_device": "/dev/nbd1", 00:34:11.213 "bdev_name": "crypto_ram3" 00:34:11.213 } 00:34:11.213 ]' 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:11.213 /dev/nbd1' 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:11.213 /dev/nbd1' 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:11.213 256+0 records in 00:34:11.213 256+0 records out 00:34:11.213 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107301 s, 97.7 MB/s 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:11.213 256+0 records in 00:34:11.213 256+0 records out 00:34:11.213 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0284471 s, 36.9 MB/s 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:11.213 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:11.472 256+0 records in 00:34:11.472 256+0 records out 00:34:11.472 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0554126 s, 18.9 MB/s 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:11.472 20:10:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:11.730 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:11.730 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:11.730 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:11.731 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:11.731 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:11.731 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:11.731 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:11.731 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:11.731 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:11.731 20:10:03 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:11.988 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:11.988 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:11.988 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:11.988 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:11.988 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:11.989 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:11.989 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:11.989 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:11.989 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:11.989 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:11.989 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # true 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:12.247 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:12.506 malloc_lvol_verify 00:34:12.506 20:10:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:12.765 c993c9f3-9ca3-4010-b17f-cd728d09d382 00:34:12.765 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:13.023 237a7293-cf04-4ba0-a411-840d45b691a7 00:34:13.023 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_start_disk lvs/lvol /dev/nbd0 00:34:13.281 /dev/nbd0 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:13.281 mke2fs 1.46.5 (30-Dec-2021) 00:34:13.281 Discarding device blocks: 0/4096 done 00:34:13.281 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:13.281 00:34:13.281 Allocating group tables: 0/1 done 00:34:13.281 Writing inode tables: 0/1 done 00:34:13.281 Creating journal (1024 blocks): done 00:34:13.281 Writing superblocks and filesystem accounting information: 0/1 done 00:34:13.281 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:13.281 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:13.540 
20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1572101 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1572101 ']' 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1572101 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:13.540 20:10:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1572101 00:34:13.540 20:10:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:13.540 20:10:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:13.540 20:10:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1572101' 00:34:13.540 killing process with pid 1572101 00:34:13.540 20:10:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1572101 00:34:13.540 20:10:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1572101 00:34:13.799 20:10:05 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:34:13.799 00:34:13.799 real 0m6.465s 00:34:13.799 user 0m9.247s 00:34:13.799 sys 0m2.535s 00:34:13.799 20:10:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:34:13.799 20:10:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:13.799 ************************************ 00:34:13.799 END TEST bdev_nbd 00:34:13.799 ************************************ 00:34:13.799 20:10:05 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:34:13.799 20:10:05 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:34:13.799 20:10:05 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:34:13.799 20:10:05 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:34:13.799 20:10:05 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:34:13.799 20:10:05 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:13.799 20:10:05 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:14.059 ************************************ 00:34:14.059 START TEST bdev_fio 00:34:14.059 ************************************ 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:14.059 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 
00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:14.059 ************************************ 00:34:14.059 START TEST bdev_fio_rw_verify 00:34:14.059 ************************************ 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # grep libasan 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:14.059 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:14.060 20:10:05 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:14.319 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:34:14.319 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:14.319 fio-3.35 00:34:14.319 Starting 2 threads 00:34:26.533 00:34:26.533 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1573212: Wed Jul 24 20:10:16 2024 00:34:26.533 read: IOPS=25.7k, BW=100MiB/s (105MB/s)(1005MiB/10000msec) 00:34:26.533 slat (usec): min=10, max=596, avg=17.50, stdev= 4.16 00:34:26.533 clat (usec): min=8, max=782, avg=124.33, stdev=45.81 00:34:26.533 lat (usec): min=27, max=801, avg=141.82, stdev=46.83 00:34:26.533 clat percentiles (usec): 00:34:26.533 | 50.000th=[ 123], 99.000th=[ 237], 99.900th=[ 269], 99.990th=[ 293], 00:34:26.533 | 99.999th=[ 742] 00:34:26.534 write: IOPS=30.9k, BW=121MiB/s (126MB/s)(1142MiB/9476msec); 0 zone resets 00:34:26.534 slat (usec): min=11, max=137, avg=28.70, stdev= 4.95 00:34:26.534 clat (usec): min=25, max=849, avg=166.33, stdev=68.61 00:34:26.534 lat (usec): min=45, max=965, avg=195.03, stdev=69.09 00:34:26.534 clat percentiles (usec): 00:34:26.534 | 50.000th=[ 165], 99.000th=[ 297], 99.900th=[ 326], 99.990th=[ 578], 00:34:26.534 | 99.999th=[ 848] 00:34:26.534 bw ( KiB/s): min=111560, max=123752, per=94.79%, avg=117029.05, stdev=1520.92, samples=38 00:34:26.534 iops : min=27890, max=30938, avg=29257.26, stdev=380.23, samples=38 00:34:26.534 lat (usec) : 10=0.01%, 20=0.01%, 50=2.97%, 100=24.10%, 250=65.72% 00:34:26.534 lat (usec) : 500=7.18%, 750=0.01%, 1000=0.01% 00:34:26.534 cpu : usr=99.61%, sys=0.01%, ctx=26, majf=0, minf=450 00:34:26.534 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:26.534 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:26.534 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:26.534 issued rwts: total=257276,292476,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:26.534 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:26.534 00:34:26.534 Run status 
group 0 (all jobs): 00:34:26.534 READ: bw=100MiB/s (105MB/s), 100MiB/s-100MiB/s (105MB/s-105MB/s), io=1005MiB (1054MB), run=10000-10000msec 00:34:26.534 WRITE: bw=121MiB/s (126MB/s), 121MiB/s-121MiB/s (126MB/s-126MB/s), io=1142MiB (1198MB), run=9476-9476msec 00:34:26.534 00:34:26.534 real 0m11.184s 00:34:26.534 user 0m23.968s 00:34:26.534 sys 0m0.333s 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:26.534 ************************************ 00:34:26.534 END TEST bdev_fio_rw_verify 00:34:26.534 ************************************ 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ef10aa21-cf1c-5ed3-b349-916a8eb7be75"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ef10aa21-cf1c-5ed3-b349-916a8eb7be75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' 
"fb2fce29-7246-5ccb-9a03-f03ea9e87789"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "fb2fce29-7246-5ccb-9a03-f03ea9e87789",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:34:26.534 crypto_ram3 ]] 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ef10aa21-cf1c-5ed3-b349-916a8eb7be75"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "ef10aa21-cf1c-5ed3-b349-916a8eb7be75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "fb2fce29-7246-5ccb-9a03-f03ea9e87789"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "fb2fce29-7246-5ccb-9a03-f03ea9e87789",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 
00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:26.534 20:10:16 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:26.534 ************************************ 00:34:26.534 START TEST bdev_fio_trim 00:34:26.534 ************************************ 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- 
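The harness lines above build `bdev.fio` by piping the bdev JSON through `jq -r 'select(.supported_io_types.unmap == true) | .name'` and emitting one `[job_<name>]` section per selected bdev. A minimal Python sketch of that same selection, using abridged records matching the bdev JSON shown in this log (illustrative only — the real test suite does this in bash with jq, and only the fields the filter reads are kept here):

```python
# Two bdev records as printed by the harness above, reduced to the
# fields the jq filter actually inspects.
bdevs = [
    {"name": "crypto_ram", "supported_io_types": {"unmap": True}},
    {"name": "crypto_ram3", "supported_io_types": {"unmap": True}},
]

# Equivalent of: jq -r 'select(.supported_io_types.unmap == true) | .name'
names = [b["name"] for b in bdevs if b["supported_io_types"].get("unmap")]

# One [job_<name>] section per selected bdev, mirroring how the
# bdev.fio config is assembled before the trim run.
fio_jobs = "\n".join(f"[job_{n}]\nfilename={n}" for n in names)
print(fio_jobs)
```

Both crypto bdevs report `"unmap": true`, so both get a job section, which is why the subsequent `bdev_fio_trim` run starts two threads.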
# fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:26.535 
20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:26.535 20:10:16 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:26.535 20:10:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:26.535 20:10:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:26.535 20:10:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:26.535 20:10:17 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:26.535 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:26.535 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:26.535 fio-3.35 00:34:26.535 Starting 2 threads 00:34:36.625 00:34:36.625 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1574718: Wed Jul 24 20:10:27 2024 00:34:36.625 write: IOPS=26.2k, BW=102MiB/s (107MB/s)(1023MiB/10001msec); 0 zone resets 00:34:36.625 slat (usec): min=14, max=658, avg=33.83, stdev=11.15 00:34:36.625 
clat (usec): min=37, max=1880, avg=247.21, stdev=144.54 00:34:36.625 lat (usec): min=52, max=1899, avg=281.04, stdev=151.68 00:34:36.625 clat percentiles (usec): 00:34:36.625 | 50.000th=[ 221], 99.000th=[ 611], 99.900th=[ 660], 99.990th=[ 865], 00:34:36.625 | 99.999th=[ 1254] 00:34:36.625 bw ( KiB/s): min=82144, max=156376, per=100.00%, avg=105341.47, stdev=11749.83, samples=38 00:34:36.625 iops : min=20536, max=39094, avg=26335.37, stdev=2937.46, samples=38 00:34:36.625 trim: IOPS=26.2k, BW=102MiB/s (107MB/s)(1023MiB/10001msec); 0 zone resets 00:34:36.625 slat (usec): min=6, max=1727, avg=16.72, stdev= 7.37 00:34:36.625 clat (usec): min=45, max=1900, avg=163.37, stdev=66.99 00:34:36.625 lat (usec): min=53, max=1910, avg=180.08, stdev=69.66 00:34:36.626 clat percentiles (usec): 00:34:36.626 | 50.000th=[ 153], 99.000th=[ 338], 99.900th=[ 367], 99.990th=[ 469], 00:34:36.626 | 99.999th=[ 660] 00:34:36.626 bw ( KiB/s): min=82144, max=156376, per=100.00%, avg=105342.74, stdev=11750.66, samples=38 00:34:36.626 iops : min=20536, max=39094, avg=26335.68, stdev=2937.67, samples=38 00:34:36.626 lat (usec) : 50=0.58%, 100=16.78%, 250=54.10%, 500=25.59%, 750=2.94% 00:34:36.626 lat (usec) : 1000=0.01% 00:34:36.626 lat (msec) : 2=0.01% 00:34:36.626 cpu : usr=99.61%, sys=0.01%, ctx=28, majf=0, minf=399 00:34:36.626 IO depths : 1=7.3%, 2=17.1%, 4=60.5%, 8=15.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:36.626 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:36.626 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:36.626 issued rwts: total=0,261927,261927,0 short=0,0,0,0 dropped=0,0,0,0 00:34:36.626 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:36.626 00:34:36.626 Run status group 0 (all jobs): 00:34:36.626 WRITE: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=1023MiB (1073MB), run=10001-10001msec 00:34:36.626 TRIM: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=1023MiB (1073MB), 
run=10001-10001msec 00:34:36.626 00:34:36.626 real 0m11.125s 00:34:36.626 user 0m23.932s 00:34:36.626 sys 0m0.368s 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:36.626 ************************************ 00:34:36.626 END TEST bdev_fio_trim 00:34:36.626 ************************************ 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:34:36.626 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:34:36.626 00:34:36.626 real 0m22.718s 00:34:36.626 user 0m48.120s 00:34:36.626 sys 0m0.911s 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:36.626 ************************************ 00:34:36.626 END TEST bdev_fio 00:34:36.626 ************************************ 00:34:36.626 20:10:28 blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:36.626 20:10:28 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:36.626 20:10:28 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:34:36.626 20:10:28 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:36.626 20:10:28 
blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:36.626 ************************************ 00:34:36.626 START TEST bdev_verify 00:34:36.626 ************************************ 00:34:36.626 20:10:28 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:36.885 [2024-07-24 20:10:28.271265] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:34:36.885 [2024-07-24 20:10:28.271333] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1576129 ] 00:34:36.885 [2024-07-24 20:10:28.393786] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:37.144 [2024-07-24 20:10:28.504416] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:37.144 [2024-07-24 20:10:28.504421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:37.144 [2024-07-24 20:10:28.674671] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:37.144 [2024-07-24 20:10:28.674739] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:37.144 [2024-07-24 20:10:28.674754] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:37.144 [2024-07-24 20:10:28.682690] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:37.144 [2024-07-24 20:10:28.682709] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:37.144 [2024-07-24 20:10:28.682720] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev 
arrival 00:34:37.144 [2024-07-24 20:10:28.690710] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:37.144 [2024-07-24 20:10:28.690730] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:37.144 [2024-07-24 20:10:28.690742] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:37.403 Running I/O for 5 seconds... 00:34:42.678 00:34:42.678 Latency(us) 00:34:42.678 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:42.678 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:42.678 Verification LBA range: start 0x0 length 0x800 00:34:42.678 crypto_ram : 5.03 5878.55 22.96 0.00 0.00 21690.09 1852.10 24276.81 00:34:42.678 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:42.678 Verification LBA range: start 0x800 length 0x800 00:34:42.678 crypto_ram : 5.01 4754.29 18.57 0.00 0.00 26809.69 2108.55 27582.11 00:34:42.678 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:42.678 Verification LBA range: start 0x0 length 0x800 00:34:42.678 crypto_ram3 : 5.04 2948.10 11.52 0.00 0.00 43179.13 1759.50 29405.72 00:34:42.678 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:42.678 Verification LBA range: start 0x800 length 0x800 00:34:42.678 crypto_ram3 : 5.03 2390.62 9.34 0.00 0.00 53201.28 2806.65 34648.60 00:34:42.678 =================================================================================================================== 00:34:42.678 Total : 15971.56 62.39 0.00 0.00 31908.63 1759.50 34648.60 00:34:42.678 00:34:42.678 real 0m5.818s 00:34:42.678 user 0m10.912s 00:34:42.678 sys 0m0.248s 00:34:42.678 20:10:34 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:42.678 20:10:34 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set 
+x 00:34:42.678 ************************************ 00:34:42.678 END TEST bdev_verify 00:34:42.678 ************************************ 00:34:42.678 20:10:34 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:42.678 20:10:34 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:34:42.678 20:10:34 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:42.678 20:10:34 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:42.678 ************************************ 00:34:42.678 START TEST bdev_verify_big_io 00:34:42.678 ************************************ 00:34:42.678 20:10:34 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:42.678 [2024-07-24 20:10:34.166339] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:34:42.678 [2024-07-24 20:10:34.166398] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1576849 ] 00:34:42.936 [2024-07-24 20:10:34.279754] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:42.936 [2024-07-24 20:10:34.386379] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:42.936 [2024-07-24 20:10:34.386383] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:43.196 [2024-07-24 20:10:34.560141] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:43.196 [2024-07-24 20:10:34.560207] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:43.196 [2024-07-24 20:10:34.560222] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:43.196 [2024-07-24 20:10:34.568161] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:43.196 [2024-07-24 20:10:34.568179] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:43.196 [2024-07-24 20:10:34.568191] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:43.196 [2024-07-24 20:10:34.576186] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:43.196 [2024-07-24 20:10:34.576206] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:43.196 [2024-07-24 20:10:34.576218] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:43.196 Running I/O for 5 seconds... 
00:34:48.473 00:34:48.473 Latency(us) 00:34:48.473 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:48.473 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:34:48.473 Verification LBA range: start 0x0 length 0x80 00:34:48.473 crypto_ram : 5.17 470.83 29.43 0.00 0.00 265540.77 6867.03 351956.81 00:34:48.473 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:34:48.473 Verification LBA range: start 0x80 length 0x80 00:34:48.473 crypto_ram : 5.18 395.54 24.72 0.00 0.00 315094.75 7636.37 412135.96 00:34:48.473 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:34:48.473 Verification LBA range: start 0x0 length 0x80 00:34:48.473 crypto_ram3 : 5.38 261.76 16.36 0.00 0.00 463383.13 6097.70 390252.63 00:34:48.473 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:34:48.473 Verification LBA range: start 0x80 length 0x80 00:34:48.473 crypto_ram3 : 5.40 213.39 13.34 0.00 0.00 561828.66 8890.10 444960.95 00:34:48.473 =================================================================================================================== 00:34:48.473 Total : 1341.53 83.85 0.00 0.00 368008.42 6097.70 444960.95 00:34:48.732 00:34:48.732 real 0m6.185s 00:34:48.732 user 0m11.662s 00:34:48.732 sys 0m0.253s 00:34:48.732 20:10:40 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:48.732 20:10:40 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:34:48.732 ************************************ 00:34:48.732 END TEST bdev_verify_big_io 00:34:48.732 ************************************ 00:34:48.992 20:10:40 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:34:48.992 20:10:40 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:34:48.992 20:10:40 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:48.992 20:10:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:48.992 ************************************ 00:34:48.992 START TEST bdev_write_zeroes 00:34:48.992 ************************************ 00:34:48.992 20:10:40 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:48.992 [2024-07-24 20:10:40.450073] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:34:48.992 [2024-07-24 20:10:40.450140] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1577610 ] 00:34:48.992 [2024-07-24 20:10:40.584801] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:49.251 [2024-07-24 20:10:40.690526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:49.510 [2024-07-24 20:10:40.858686] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:49.510 [2024-07-24 20:10:40.858758] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:49.510 [2024-07-24 20:10:40.858774] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:49.510 [2024-07-24 20:10:40.866705] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:49.510 [2024-07-24 20:10:40.866724] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:49.510 
[2024-07-24 20:10:40.866736] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:49.510 [2024-07-24 20:10:40.874726] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:49.510 [2024-07-24 20:10:40.874743] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:49.510 [2024-07-24 20:10:40.874754] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:49.510 Running I/O for 1 seconds... 00:34:50.448 00:34:50.448 Latency(us) 00:34:50.448 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:50.448 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:50.448 crypto_ram : 1.01 26449.96 103.32 0.00 0.00 4826.84 1303.60 6610.59 00:34:50.448 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:50.448 crypto_ram3 : 1.01 13197.97 51.55 0.00 0.00 9622.25 5983.72 9915.88 00:34:50.448 =================================================================================================================== 00:34:50.448 Total : 39647.93 154.87 0.00 0.00 6425.31 1303.60 9915.88 00:34:50.707 00:34:50.707 real 0m1.764s 00:34:50.707 user 0m1.510s 00:34:50.707 sys 0m0.236s 00:34:50.707 20:10:42 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:50.707 20:10:42 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:34:50.707 ************************************ 00:34:50.707 END TEST bdev_write_zeroes 00:34:50.707 ************************************ 00:34:50.707 20:10:42 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:34:50.707 20:10:42 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:34:50.707 20:10:42 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:50.707 20:10:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:50.707 ************************************ 00:34:50.707 START TEST bdev_json_nonenclosed 00:34:50.707 ************************************ 00:34:50.707 20:10:42 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:50.967 [2024-07-24 20:10:42.301803] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:34:50.967 [2024-07-24 20:10:42.301872] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1577924 ] 00:34:50.967 [2024-07-24 20:10:42.432641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:50.967 [2024-07-24 20:10:42.535721] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:50.967 [2024-07-24 20:10:42.535799] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:34:50.967 [2024-07-24 20:10:42.535816] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:50.967 [2024-07-24 20:10:42.535829] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:51.227 00:34:51.227 real 0m0.407s 00:34:51.227 user 0m0.249s 00:34:51.227 sys 0m0.154s 00:34:51.227 20:10:42 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:51.227 20:10:42 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:34:51.227 ************************************ 00:34:51.227 END TEST bdev_json_nonenclosed 00:34:51.227 ************************************ 00:34:51.227 20:10:42 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:51.227 20:10:42 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:34:51.227 20:10:42 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:51.227 20:10:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:51.227 ************************************ 00:34:51.227 START TEST bdev_json_nonarray 00:34:51.227 ************************************ 00:34:51.227 20:10:42 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:51.227 [2024-07-24 20:10:42.798483] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:34:51.227 [2024-07-24 20:10:42.798552] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1577948 ] 00:34:51.486 [2024-07-24 20:10:42.929307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:51.486 [2024-07-24 20:10:43.033125] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:51.486 [2024-07-24 20:10:43.033206] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:34:51.486 [2024-07-24 20:10:43.033224] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:51.486 [2024-07-24 20:10:43.033237] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:51.746 00:34:51.746 real 0m0.402s 00:34:51.746 user 0m0.246s 00:34:51.746 sys 0m0.153s 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:34:51.746 ************************************ 00:34:51.746 END TEST bdev_json_nonarray 00:34:51.746 ************************************ 00:34:51.746 20:10:43 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:34:51.746 20:10:43 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:34:51.746 20:10:43 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]] 00:34:51.746 20:10:43 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:34:51.746 20:10:43 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:34:51.746 20:10:43 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:51.746 20:10:43 blockdev_crypto_sw -- 
common/autotest_common.sh@10 -- # set +x 00:34:51.746 ************************************ 00:34:51.746 START TEST bdev_crypto_enomem 00:34:51.746 ************************************ 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # bdev_crypto_enomem 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local err_dev=EE_base0 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=1578038 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 1578038 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # '[' -z 1578038 ']' 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:51.746 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:51.746 20:10:43 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:51.746 [2024-07-24 20:10:43.285379] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:34:51.746 [2024-07-24 20:10:43.285453] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1578038 ] 00:34:52.013 [2024-07-24 20:10:43.422963] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:52.013 [2024-07-24 20:10:43.556160] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@864 -- # return 0 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:52.960 true 00:34:52.960 base0 00:34:52.960 true 00:34:52.960 [2024-07-24 20:10:44.263457] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:52.960 crypt0 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_name=crypt0 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # local bdev_timeout= 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # local i 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:52.960 [ 00:34:52.960 { 00:34:52.960 "name": "crypt0", 00:34:52.960 "aliases": [ 00:34:52.960 "3dc42bdc-ca8f-5b6e-8acf-a3450b59e612" 00:34:52.960 ], 00:34:52.960 "product_name": "crypto", 00:34:52.960 "block_size": 512, 00:34:52.960 "num_blocks": 2097152, 00:34:52.960 "uuid": "3dc42bdc-ca8f-5b6e-8acf-a3450b59e612", 00:34:52.960 "assigned_rate_limits": { 00:34:52.960 "rw_ios_per_sec": 0, 00:34:52.960 "rw_mbytes_per_sec": 0, 00:34:52.960 "r_mbytes_per_sec": 0, 00:34:52.960 "w_mbytes_per_sec": 0 00:34:52.960 }, 00:34:52.960 "claimed": false, 00:34:52.960 "zoned": false, 00:34:52.960 "supported_io_types": { 00:34:52.960 "read": true, 00:34:52.960 "write": true, 00:34:52.960 "unmap": false, 00:34:52.960 "flush": false, 00:34:52.960 "reset": true, 00:34:52.960 "nvme_admin": false, 
00:34:52.960 "nvme_io": false, 00:34:52.960 "nvme_io_md": false, 00:34:52.960 "write_zeroes": true, 00:34:52.960 "zcopy": false, 00:34:52.960 "get_zone_info": false, 00:34:52.960 "zone_management": false, 00:34:52.960 "zone_append": false, 00:34:52.960 "compare": false, 00:34:52.960 "compare_and_write": false, 00:34:52.960 "abort": false, 00:34:52.960 "seek_hole": false, 00:34:52.960 "seek_data": false, 00:34:52.960 "copy": false, 00:34:52.960 "nvme_iov_md": false 00:34:52.960 }, 00:34:52.960 "memory_domains": [ 00:34:52.960 { 00:34:52.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:52.960 "dma_device_type": 2 00:34:52.960 } 00:34:52.960 ], 00:34:52.960 "driver_specific": { 00:34:52.960 "crypto": { 00:34:52.960 "base_bdev_name": "EE_base0", 00:34:52.960 "name": "crypt0", 00:34:52.960 "key_name": "test_dek_sw" 00:34:52.960 } 00:34:52.960 } 00:34:52.960 } 00:34:52.960 ] 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@907 -- # return 0 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=1578152 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:34:52.960 20:10:44 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:52.960 Running I/O for 5 seconds... 
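The ENOMEM run below reports both IOPS and MiB/s for 4096-byte writes; the two columns are related by the fixed I/O size. A small sketch of that conversion (hypothetical helper, shown only to make the bdevperf table readable):

```python
def iops_to_mibps(iops, io_size_bytes):
    """Convert an IOPS figure to MiB/s for a fixed I/O size."""
    return iops * io_size_bytes / (1024 * 1024)

# The crypt0 row below reports ~28000.29 IOPS at 4096-byte I/O:
print(round(iops_to_mibps(28000.29, 4096), 2))  # → 109.38
```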
00:34:53.898 20:10:45 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:34:53.898 20:10:45 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:34:53.898 20:10:45 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:53.898 20:10:45 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:34:53.898 20:10:45 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 1578152 00:34:58.090 00:34:58.090 Latency(us) 00:34:58.091 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:58.091 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:34:58.091 crypt0 : 5.00 28000.29 109.38 0.00 0.00 1137.97 527.14 1495.93 00:34:58.091 =================================================================================================================== 00:34:58.091 Total : 28000.29 109.38 0.00 0.00 1137.97 527.14 1495.93 00:34:58.091 0 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 1578038 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # '[' -z 1578038 ']' 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # kill -0 1578038 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # uname 00:34:58.091 20:10:49 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1578038 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1578038' 00:34:58.091 killing process with pid 1578038 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@969 -- # kill 1578038 00:34:58.091 Received shutdown signal, test time was about 5.000000 seconds 00:34:58.091 00:34:58.091 Latency(us) 00:34:58.091 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:58.091 =================================================================================================================== 00:34:58.091 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:58.091 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@974 -- # wait 1578038 00:34:58.350 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:34:58.350 00:34:58.350 real 0m6.582s 00:34:58.350 user 0m6.813s 00:34:58.350 sys 0m0.442s 00:34:58.350 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:58.350 20:10:49 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:58.350 ************************************ 00:34:58.350 END TEST bdev_crypto_enomem 00:34:58.350 ************************************ 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # 
cleanup 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:34:58.350 20:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:34:58.350 00:34:58.350 real 0m55.365s 00:34:58.350 user 1m35.149s 00:34:58.350 sys 0m6.776s 00:34:58.350 20:10:49 blockdev_crypto_sw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:58.350 20:10:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:58.350 ************************************ 00:34:58.350 END TEST blockdev_crypto_sw 00:34:58.350 ************************************ 00:34:58.350 20:10:49 -- spdk/autotest.sh@363 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:58.350 20:10:49 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:34:58.350 20:10:49 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:34:58.350 20:10:49 -- common/autotest_common.sh@10 -- # set +x 00:34:58.609 ************************************ 00:34:58.609 START TEST blockdev_crypto_qat 00:34:58.609 ************************************ 00:34:58.609 20:10:49 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:58.609 * Looking for test storage... 
00:34:58.609 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:34:58.609 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # 
env_ctx= 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1578911 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:58.610 20:10:50 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1578911 00:34:58.610 20:10:50 blockdev_crypto_qat -- common/autotest_common.sh@831 -- # '[' -z 1578911 ']' 00:34:58.610 20:10:50 blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:58.610 20:10:50 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:58.610 20:10:50 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:58.610 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:58.610 20:10:50 blockdev_crypto_qat -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:58.610 20:10:50 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:58.610 [2024-07-24 20:10:50.147835] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:34:58.610 [2024-07-24 20:10:50.147916] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1578911 ] 00:34:58.868 [2024-07-24 20:10:50.277041] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:58.868 [2024-07-24 20:10:50.375252] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:59.435 20:10:51 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:59.435 20:10:51 blockdev_crypto_qat -- common/autotest_common.sh@864 -- # return 0 00:34:59.435 20:10:51 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:34:59.435 20:10:51 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:34:59.435 20:10:51 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:34:59.435 20:10:51 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:34:59.435 20:10:51 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:59.435 [2024-07-24 20:10:51.025401] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:59.694 [2024-07-24 20:10:51.033439] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:59.694 [2024-07-24 20:10:51.041452] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:59.694 [2024-07-24 20:10:51.110736] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:02.232 true 00:35:02.232 true 00:35:02.232 true 00:35:02.232 true 00:35:02.232 Malloc0 00:35:02.232 Malloc1 00:35:02.232 Malloc2 00:35:02.232 Malloc3 00:35:02.232 [2024-07-24 20:10:53.471577] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:35:02.232 crypto_ram 00:35:02.232 [2024-07-24 20:10:53.479595] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:02.232 crypto_ram1 00:35:02.232 [2024-07-24 20:10:53.487613] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:02.232 crypto_ram2 00:35:02.232 [2024-07-24 20:10:53.495635] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:02.232 crypto_ram3 00:35:02.232 [ 00:35:02.232 { 00:35:02.232 "name": "Malloc1", 00:35:02.232 "aliases": [ 00:35:02.232 "42e0a550-a271-4e31-b460-246b6ed08c57" 00:35:02.232 ], 00:35:02.232 "product_name": "Malloc disk", 00:35:02.232 "block_size": 512, 00:35:02.232 "num_blocks": 65536, 00:35:02.232 "uuid": "42e0a550-a271-4e31-b460-246b6ed08c57", 00:35:02.232 "assigned_rate_limits": { 00:35:02.232 "rw_ios_per_sec": 0, 00:35:02.232 "rw_mbytes_per_sec": 0, 00:35:02.232 "r_mbytes_per_sec": 0, 00:35:02.232 "w_mbytes_per_sec": 0 00:35:02.232 }, 00:35:02.232 "claimed": true, 00:35:02.232 "claim_type": "exclusive_write", 00:35:02.232 "zoned": false, 00:35:02.232 "supported_io_types": { 00:35:02.232 "read": true, 00:35:02.232 "write": true, 00:35:02.232 "unmap": true, 00:35:02.232 "flush": true, 00:35:02.232 "reset": true, 00:35:02.232 "nvme_admin": false, 00:35:02.232 "nvme_io": false, 00:35:02.232 "nvme_io_md": false, 00:35:02.232 "write_zeroes": true, 00:35:02.232 "zcopy": true, 00:35:02.232 "get_zone_info": false, 00:35:02.232 "zone_management": false, 00:35:02.232 "zone_append": false, 00:35:02.232 "compare": false, 00:35:02.232 "compare_and_write": false, 00:35:02.232 "abort": true, 00:35:02.232 "seek_hole": false, 00:35:02.232 "seek_data": false, 00:35:02.232 "copy": true, 00:35:02.232 "nvme_iov_md": false 00:35:02.232 }, 00:35:02.232 "memory_domains": [ 00:35:02.232 { 00:35:02.232 "dma_device_id": "system", 00:35:02.232 "dma_device_type": 1 00:35:02.232 }, 00:35:02.232 { 00:35:02.232 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:35:02.232 "dma_device_type": 2 00:35:02.232 } 00:35:02.232 ], 00:35:02.232 "driver_specific": {} 00:35:02.232 } 00:35:02.232 ] 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:35:02.232 20:10:53 blockdev_crypto_qat -- 
bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "458a9beb-ccdf-5581-a8f1-6994d0312047"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "458a9beb-ccdf-5581-a8f1-6994d0312047",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "677c9d01-f1dc-530a-b77f-ca6218c08ed1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"677c9d01-f1dc-530a-b77f-ca6218c08ed1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ae662a0c-c863-512f-afe5-057b76117c11"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ae662a0c-c863-512f-afe5-057b76117c11",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' 
"name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "463f6e78-c318-5381-89b9-ccf81b29ba95"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "463f6e78-c318-5381-89b9-ccf81b29ba95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:35:02.232 20:10:53 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 1578911 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # '[' -z 1578911 ']' 00:35:02.232 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # kill -0 1578911 00:35:02.233 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # uname 00:35:02.233 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:02.233 20:10:53 
blockdev_crypto_qat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1578911 00:35:02.233 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:02.233 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:02.233 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1578911' 00:35:02.233 killing process with pid 1578911 00:35:02.233 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@969 -- # kill 1578911 00:35:02.233 20:10:53 blockdev_crypto_qat -- common/autotest_common.sh@974 -- # wait 1578911 00:35:02.802 20:10:54 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:35:02.802 20:10:54 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:02.802 20:10:54 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:35:02.802 20:10:54 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:02.802 20:10:54 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:02.802 ************************************ 00:35:02.802 START TEST bdev_hello_world 00:35:02.802 ************************************ 00:35:02.802 20:10:54 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:03.061 [2024-07-24 20:10:54.432064] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:35:03.061 [2024-07-24 20:10:54.432130] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1579483 ] 00:35:03.061 [2024-07-24 20:10:54.562352] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:03.321 [2024-07-24 20:10:54.663733] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:03.321 [2024-07-24 20:10:54.685099] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:03.321 [2024-07-24 20:10:54.693127] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:03.321 [2024-07-24 20:10:54.701156] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:03.321 [2024-07-24 20:10:54.809136] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:05.940 [2024-07-24 20:10:57.019259] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:05.940 [2024-07-24 20:10:57.019321] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:05.940 [2024-07-24 20:10:57.019336] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:05.940 [2024-07-24 20:10:57.027279] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:05.940 [2024-07-24 20:10:57.027298] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:05.940 [2024-07-24 20:10:57.027310] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:05.940 [2024-07-24 20:10:57.035302] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:35:05.940 [2024-07-24 20:10:57.035322] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:05.940 [2024-07-24 20:10:57.035334] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:05.940 [2024-07-24 20:10:57.043319] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:05.940 [2024-07-24 20:10:57.043336] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:05.940 [2024-07-24 20:10:57.043347] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:05.940 [2024-07-24 20:10:57.121041] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:35:05.940 [2024-07-24 20:10:57.121079] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:35:05.940 [2024-07-24 20:10:57.121096] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:35:05.941 [2024-07-24 20:10:57.122372] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:35:05.941 [2024-07-24 20:10:57.122449] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:35:05.941 [2024-07-24 20:10:57.122467] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:35:05.941 [2024-07-24 20:10:57.122511] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:35:05.941 00:35:05.941 [2024-07-24 20:10:57.122531] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:35:06.201 00:35:06.201 real 0m3.164s 00:35:06.201 user 0m2.749s 00:35:06.201 sys 0m0.381s 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:35:06.201 ************************************ 00:35:06.201 END TEST bdev_hello_world 00:35:06.201 ************************************ 00:35:06.201 20:10:57 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:35:06.201 20:10:57 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:35:06.201 20:10:57 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:06.201 20:10:57 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:06.201 ************************************ 00:35:06.201 START TEST bdev_bounds 00:35:06.201 ************************************ 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1579993 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1579993' 00:35:06.201 Process bdevio pid: 1579993 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1579993 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds 
-- common/autotest_common.sh@831 -- # '[' -z 1579993 ']' 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:06.201 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:06.201 20:10:57 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:35:06.201 [2024-07-24 20:10:57.688037] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:35:06.201 [2024-07-24 20:10:57.688106] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1579993 ] 00:35:06.461 [2024-07-24 20:10:57.814574] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:35:06.461 [2024-07-24 20:10:57.918970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:06.461 [2024-07-24 20:10:57.919072] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:06.461 [2024-07-24 20:10:57.919074] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:06.461 [2024-07-24 20:10:57.940561] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:06.461 [2024-07-24 20:10:57.948589] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:06.461 [2024-07-24 20:10:57.956609] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:06.721 [2024-07-24 20:10:58.059398] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:09.258 [2024-07-24 20:11:00.267447] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:09.258 [2024-07-24 20:11:00.267522] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:09.258 [2024-07-24 20:11:00.267538] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.258 [2024-07-24 20:11:00.275464] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:09.258 [2024-07-24 20:11:00.275483] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:09.258 [2024-07-24 20:11:00.275495] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.258 [2024-07-24 20:11:00.283489] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:09.258 [2024-07-24 20:11:00.283508] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:09.258 [2024-07-24 20:11:00.283519] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.258 [2024-07-24 20:11:00.291513] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:09.258 [2024-07-24 20:11:00.291530] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:09.258 [2024-07-24 20:11:00.291541] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.258 20:11:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:09.258 20:11:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@864 -- # 
return 0 00:35:09.258 20:11:00 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:35:09.258 I/O targets: 00:35:09.258 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:35:09.258 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:35:09.258 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:35:09.258 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:35:09.258 00:35:09.258 00:35:09.258 CUnit - A unit testing framework for C - Version 2.1-3 00:35:09.258 http://cunit.sourceforge.net/ 00:35:09.258 00:35:09.258 00:35:09.258 Suite: bdevio tests on: crypto_ram3 00:35:09.258 Test: blockdev write read block ...passed 00:35:09.258 Test: blockdev write zeroes read block ...passed 00:35:09.258 Test: blockdev write zeroes read no split ...passed 00:35:09.258 Test: blockdev write zeroes read split ...passed 00:35:09.258 Test: blockdev write zeroes read split partial ...passed 00:35:09.258 Test: blockdev reset ...passed 00:35:09.258 Test: blockdev write read 8 blocks ...passed 00:35:09.258 Test: blockdev write read size > 128k ...passed 00:35:09.258 Test: blockdev write read invalid size ...passed 00:35:09.258 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:09.258 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:09.258 Test: blockdev write read max offset ...passed 00:35:09.258 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:09.258 Test: blockdev writev readv 8 blocks ...passed 00:35:09.258 Test: blockdev writev readv 30 x 1block ...passed 00:35:09.258 Test: blockdev writev readv block ...passed 00:35:09.258 Test: blockdev writev readv size > 128k ...passed 00:35:09.258 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:09.258 Test: blockdev comparev and writev ...passed 00:35:09.258 Test: blockdev nvme passthru rw ...passed 00:35:09.258 Test: blockdev nvme passthru vendor 
specific ...passed 00:35:09.258 Test: blockdev nvme admin passthru ...passed 00:35:09.258 Test: blockdev copy ...passed 00:35:09.258 Suite: bdevio tests on: crypto_ram2 00:35:09.258 Test: blockdev write read block ...passed 00:35:09.258 Test: blockdev write zeroes read block ...passed 00:35:09.258 Test: blockdev write zeroes read no split ...passed 00:35:09.258 Test: blockdev write zeroes read split ...passed 00:35:09.258 Test: blockdev write zeroes read split partial ...passed 00:35:09.258 Test: blockdev reset ...passed 00:35:09.258 Test: blockdev write read 8 blocks ...passed 00:35:09.258 Test: blockdev write read size > 128k ...passed 00:35:09.258 Test: blockdev write read invalid size ...passed 00:35:09.258 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:09.258 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:09.258 Test: blockdev write read max offset ...passed 00:35:09.258 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:09.258 Test: blockdev writev readv 8 blocks ...passed 00:35:09.258 Test: blockdev writev readv 30 x 1block ...passed 00:35:09.258 Test: blockdev writev readv block ...passed 00:35:09.258 Test: blockdev writev readv size > 128k ...passed 00:35:09.258 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:09.258 Test: blockdev comparev and writev ...passed 00:35:09.258 Test: blockdev nvme passthru rw ...passed 00:35:09.258 Test: blockdev nvme passthru vendor specific ...passed 00:35:09.258 Test: blockdev nvme admin passthru ...passed 00:35:09.258 Test: blockdev copy ...passed 00:35:09.258 Suite: bdevio tests on: crypto_ram1 00:35:09.258 Test: blockdev write read block ...passed 00:35:09.258 Test: blockdev write zeroes read block ...passed 00:35:09.258 Test: blockdev write zeroes read no split ...passed 00:35:09.258 Test: blockdev write zeroes read split ...passed 00:35:09.516 Test: blockdev write zeroes read split partial ...passed 00:35:09.516 
Test: blockdev reset ...passed 00:35:09.516 Test: blockdev write read 8 blocks ...passed 00:35:09.516 Test: blockdev write read size > 128k ...passed 00:35:09.516 Test: blockdev write read invalid size ...passed 00:35:09.516 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:09.516 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:09.516 Test: blockdev write read max offset ...passed 00:35:09.516 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:09.516 Test: blockdev writev readv 8 blocks ...passed 00:35:09.516 Test: blockdev writev readv 30 x 1block ...passed 00:35:09.516 Test: blockdev writev readv block ...passed 00:35:09.516 Test: blockdev writev readv size > 128k ...passed 00:35:09.516 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:09.516 Test: blockdev comparev and writev ...passed 00:35:09.516 Test: blockdev nvme passthru rw ...passed 00:35:09.516 Test: blockdev nvme passthru vendor specific ...passed 00:35:09.516 Test: blockdev nvme admin passthru ...passed 00:35:09.516 Test: blockdev copy ...passed 00:35:09.516 Suite: bdevio tests on: crypto_ram 00:35:09.516 Test: blockdev write read block ...passed 00:35:09.516 Test: blockdev write zeroes read block ...passed 00:35:09.516 Test: blockdev write zeroes read no split ...passed 00:35:09.516 Test: blockdev write zeroes read split ...passed 00:35:09.775 Test: blockdev write zeroes read split partial ...passed 00:35:09.775 Test: blockdev reset ...passed 00:35:09.775 Test: blockdev write read 8 blocks ...passed 00:35:09.775 Test: blockdev write read size > 128k ...passed 00:35:09.775 Test: blockdev write read invalid size ...passed 00:35:09.775 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:09.775 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:09.775 Test: blockdev write read max offset ...passed 00:35:09.775 Test: blockdev write read 2 blocks on 
overlapped address offset ...passed 00:35:09.775 Test: blockdev writev readv 8 blocks ...passed 00:35:09.775 Test: blockdev writev readv 30 x 1block ...passed 00:35:09.776 Test: blockdev writev readv block ...passed 00:35:09.776 Test: blockdev writev readv size > 128k ...passed 00:35:09.776 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:09.776 Test: blockdev comparev and writev ...passed 00:35:09.776 Test: blockdev nvme passthru rw ...passed 00:35:09.776 Test: blockdev nvme passthru vendor specific ...passed 00:35:09.776 Test: blockdev nvme admin passthru ...passed 00:35:09.776 Test: blockdev copy ...passed 00:35:09.776 00:35:09.776 Run Summary: Type Total Ran Passed Failed Inactive 00:35:09.776 suites 4 4 n/a 0 0 00:35:09.776 tests 92 92 92 0 0 00:35:09.776 asserts 520 520 520 0 n/a 00:35:09.776 00:35:09.776 Elapsed time = 1.569 seconds 00:35:09.776 0 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1579993 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1579993 ']' 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1579993 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1579993 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1579993' 00:35:09.776 killing process with pid 1579993 00:35:09.776 20:11:01 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1579993 00:35:09.776 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1579993 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:35:10.345 00:35:10.345 real 0m4.126s 00:35:10.345 user 0m10.989s 00:35:10.345 sys 0m0.564s 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:35:10.345 ************************************ 00:35:10.345 END TEST bdev_bounds 00:35:10.345 ************************************ 00:35:10.345 20:11:01 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:35:10.345 20:11:01 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:35:10.345 20:11:01 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:10.345 20:11:01 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:10.345 ************************************ 00:35:10.345 START TEST bdev_nbd 00:35:10.345 ************************************ 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1580543 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:35:10.345 20:11:01 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1580543 /var/tmp/spdk-nbd.sock 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1580543 ']' 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:35:10.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:35:10.345 20:11:01 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:35:10.345 [2024-07-24 20:11:01.913051] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:35:10.345 [2024-07-24 20:11:01.913129] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:10.604 [2024-07-24 20:11:02.047426] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:10.604 [2024-07-24 20:11:02.146214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:10.604 [2024-07-24 20:11:02.167574] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:10.604 [2024-07-24 20:11:02.175596] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:10.604 [2024-07-24 20:11:02.183614] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:10.862 [2024-07-24 20:11:02.288914] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:13.396 [2024-07-24 20:11:04.505071] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:13.396 [2024-07-24 20:11:04.505128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:13.396 [2024-07-24 20:11:04.505144] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:13.396 [2024-07-24 20:11:04.513089] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:13.396 [2024-07-24 20:11:04.513110] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:13.396 [2024-07-24 20:11:04.513122] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:13.396 [2024-07-24 20:11:04.521111] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 
00:35:13.396 [2024-07-24 20:11:04.521134] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:13.396 [2024-07-24 20:11:04.521145] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:13.396 [2024-07-24 20:11:04.529130] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:13.396 [2024-07-24 20:11:04.529149] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:13.396 [2024-07-24 20:11:04.529161] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:35:13.396 20:11:04 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:13.396 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:13.397 1+0 records in 00:35:13.397 1+0 records out 00:35:13.397 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333936 
s, 12.3 MB/s 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:13.397 20:11:04 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:13.656 20:11:05 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:13.656 1+0 records in 00:35:13.656 1+0 records out 00:35:13.656 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267456 s, 15.3 MB/s 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:13.656 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # local i 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:13.915 1+0 records in 00:35:13.915 1+0 records out 00:35:13.915 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258303 s, 15.9 MB/s 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:13.915 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:14.174 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_start_disk crypto_ram3 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:14.433 1+0 records in 00:35:14.433 1+0 records out 00:35:14.433 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000406087 s, 10.1 MB/s 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:14.433 20:11:05 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:14.692 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:35:14.692 { 00:35:14.692 "nbd_device": "/dev/nbd0", 00:35:14.692 "bdev_name": "crypto_ram" 00:35:14.692 }, 00:35:14.692 { 00:35:14.692 "nbd_device": "/dev/nbd1", 00:35:14.692 "bdev_name": "crypto_ram1" 00:35:14.692 }, 00:35:14.692 { 00:35:14.692 "nbd_device": "/dev/nbd2", 00:35:14.692 "bdev_name": "crypto_ram2" 00:35:14.692 }, 00:35:14.692 { 00:35:14.692 "nbd_device": "/dev/nbd3", 00:35:14.692 "bdev_name": "crypto_ram3" 00:35:14.692 } 00:35:14.692 ]' 00:35:14.692 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:35:14.692 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:35:14.692 { 00:35:14.692 "nbd_device": "/dev/nbd0", 00:35:14.693 "bdev_name": "crypto_ram" 00:35:14.693 }, 00:35:14.693 { 00:35:14.693 "nbd_device": "/dev/nbd1", 00:35:14.693 "bdev_name": "crypto_ram1" 00:35:14.693 }, 00:35:14.693 { 00:35:14.693 "nbd_device": "/dev/nbd2", 00:35:14.693 "bdev_name": "crypto_ram2" 00:35:14.693 }, 00:35:14.693 { 00:35:14.693 "nbd_device": "/dev/nbd3", 00:35:14.693 "bdev_name": "crypto_ram3" 00:35:14.693 } 00:35:14.693 ]' 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:14.693 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:14.952 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 
00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:15.211 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:15.470 20:11:06 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:15.729 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 
00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 
00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:15.989 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:35:16.248 /dev/nbd0 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:16.249 1+0 records in 00:35:16.249 1+0 records out 00:35:16.249 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000338698 s, 12.1 MB/s 00:35:16.249 20:11:07 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:16.249 20:11:07 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:35:16.508 /dev/nbd1 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:16.508 20:11:08 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:16.508 1+0 records in 00:35:16.508 1+0 records out 00:35:16.508 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000327051 s, 12.5 MB/s 00:35:16.508 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:16.766 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:16.766 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:16.766 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:16.766 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:16.766 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:16.766 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:16.767 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:35:16.767 /dev/nbd10 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:17.026 20:11:08 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:17.026 1+0 records in 00:35:17.026 1+0 records out 00:35:17.026 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237115 s, 17.3 MB/s 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:17.026 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:35:17.284 /dev/nbd11 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 
00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:17.284 1+0 records in 00:35:17.284 1+0 records out 00:35:17.284 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319317 s, 12.8 MB/s 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 
00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:17.284 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:17.543 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd0", 00:35:17.543 "bdev_name": "crypto_ram" 00:35:17.543 }, 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd1", 00:35:17.543 "bdev_name": "crypto_ram1" 00:35:17.543 }, 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd10", 00:35:17.543 "bdev_name": "crypto_ram2" 00:35:17.543 }, 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd11", 00:35:17.543 "bdev_name": "crypto_ram3" 00:35:17.543 } 00:35:17.543 ]' 00:35:17.543 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd0", 00:35:17.543 "bdev_name": "crypto_ram" 00:35:17.543 }, 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd1", 00:35:17.543 "bdev_name": "crypto_ram1" 00:35:17.543 }, 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd10", 00:35:17.543 "bdev_name": "crypto_ram2" 00:35:17.543 }, 00:35:17.543 { 00:35:17.543 "nbd_device": "/dev/nbd11", 00:35:17.543 "bdev_name": "crypto_ram3" 00:35:17.543 } 00:35:17.543 ]' 00:35:17.543 20:11:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:35:17.543 /dev/nbd1 00:35:17.543 /dev/nbd10 00:35:17.543 /dev/nbd11' 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo 
'/dev/nbd0 00:35:17.543 /dev/nbd1 00:35:17.543 /dev/nbd10 00:35:17.543 /dev/nbd11' 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:35:17.543 256+0 records in 00:35:17.543 256+0 records out 00:35:17.543 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114575 s, 91.5 MB/s 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:17.543 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:35:17.802 256+0 records in 00:35:17.802 256+0 records 
out 00:35:17.802 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0830799 s, 12.6 MB/s 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:35:17.802 256+0 records in 00:35:17.802 256+0 records out 00:35:17.802 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0653053 s, 16.1 MB/s 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:35:17.802 256+0 records in 00:35:17.802 256+0 records out 00:35:17.802 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0574144 s, 18.3 MB/s 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:35:17.802 256+0 records in 00:35:17.802 256+0 records out 00:35:17.802 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0551473 s, 19.0 MB/s 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 
00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:17.802 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:18.062 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:18.321 20:11:09 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:18.321 20:11:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:18.643 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:35:18.902 20:11:10 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:18.902 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- 
# '[' 0 -ne 0 ']' 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:35:19.161 20:11:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:35:19.728 malloc_lvol_verify 00:35:19.728 20:11:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:35:19.987 140d2f3b-ef9a-4636-ab29-5c17e6d7baaf 00:35:19.987 20:11:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:35:20.555 141a81c7-aef4-462b-ab3a-7999cb78fa5a 00:35:20.555 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:35:21.124 /dev/nbd0 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:35:21.124 mke2fs 1.46.5 (30-Dec-2021) 00:35:21.124 Discarding device blocks: 0/4096 done 00:35:21.124 Creating filesystem with 4096 1k blocks and 1024 inodes 
00:35:21.124 00:35:21.124 Allocating group tables: 0/1 done 00:35:21.124 Writing inode tables: 0/1 done 00:35:21.124 Creating journal (1024 blocks): done 00:35:21.124 Writing superblocks and filesystem accounting information: 0/1 done 00:35:21.124 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:21.124 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:21.383 20:11:12 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1580543 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1580543 ']' 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1580543 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1580543 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1580543' 00:35:21.383 killing process with pid 1580543 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1580543 00:35:21.383 20:11:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1580543 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:35:21.959 00:35:21.959 real 0m11.418s 00:35:21.959 user 0m15.379s 00:35:21.959 sys 0m4.373s 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:35:21.959 ************************************ 00:35:21.959 END TEST bdev_nbd 00:35:21.959 ************************************ 00:35:21.959 20:11:13 blockdev_crypto_qat -- 
bdev/blockdev.sh@762 -- # [[ y == y ]] 00:35:21.959 20:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:35:21.959 20:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:35:21.959 20:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:35:21.959 20:11:13 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:35:21.959 20:11:13 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:21.959 20:11:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:21.959 ************************************ 00:35:21.959 START TEST bdev_fio 00:35:21.959 ************************************ 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:21.959 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
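The `fio_config_gen` call being traced here seeds `bdev.fio` with a global section for the `verify` workload, and the loop that follows appends one `[job_<bdev>]` section per crypto bdev (`[job_crypto_ram]`, `filename=crypto_ram`, and so on). A rough sketch of that generation step; the global options and bdev names below are illustrative placeholders, not the exact template from autotest_common.sh:

```shell
# Sketch: assemble a fio job file the way the test builds bdev.fio --
# a global section first, then one job section per bdev under test.
config_file=$(mktemp)
bdevs_name=(crypto_ram crypto_ram1 crypto_ram2 crypto_ram3)

# Global section (contents here are illustrative).
cat > "$config_file" <<'EOF'
[global]
thread=1
rw=randwrite
verify=md5
EOF

# One job section per bdev, matching the echoed [job_*] lines in the log.
for b in "${bdevs_name[@]}"; do
    {
        echo "[job_${b}]"
        echo "filename=${b}"
    } >> "$config_file"
done
```

The resulting file is what the later `fio_bdev --ioengine=spdk_bdev ... bdev.fio` invocation consumes, with `--spdk_json_conf` pointing fio's SPDK plugin at the bdev configuration.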
00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:21.959 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:35:21.960 20:11:13 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:21.960 ************************************ 00:35:21.960 START TEST bdev_fio_rw_verify 00:35:21.960 ************************************ 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1343 -- # local asan_lib= 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:21.960 20:11:13 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # 
/usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:22.533 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:22.533 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:22.533 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:22.533 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:22.533 fio-3.35 00:35:22.533 Starting 4 threads 00:35:37.478 00:35:37.478 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1582581: Wed Jul 24 20:11:26 2024 00:35:37.478 read: IOPS=18.3k, BW=71.6MiB/s (75.0MB/s)(716MiB/10001msec) 00:35:37.478 slat (usec): min=17, max=492, avg=76.08, stdev=32.48 00:35:37.478 clat (usec): min=19, max=1629, avg=413.69, stdev=228.19 00:35:37.478 lat (usec): min=57, max=1802, avg=489.76, stdev=239.84 00:35:37.478 clat percentiles (usec): 00:35:37.478 | 50.000th=[ 371], 99.000th=[ 1020], 99.900th=[ 1336], 99.990th=[ 1565], 00:35:37.478 | 99.999th=[ 1631] 00:35:37.478 write: IOPS=20.2k, BW=79.1MiB/s (82.9MB/s)(771MiB/9750msec); 0 zone resets 00:35:37.478 slat (usec): min=25, max=355, avg=88.16, stdev=31.17 00:35:37.478 clat (usec): min=17, max=2231, avg=456.81, stdev=244.17 00:35:37.478 lat (usec): min=47, max=2467, avg=544.97, stdev=254.12 00:35:37.478 clat percentiles (usec): 00:35:37.478 | 50.000th=[ 420], 99.000th=[ 1090], 99.900th=[ 1434], 99.990th=[ 1729], 00:35:37.478 | 99.999th=[ 2114] 00:35:37.478 bw ( KiB/s): min=59360, max=109728, per=97.92%, avg=79304.84, stdev=3225.06, 
samples=76 00:35:37.478 iops : min=14840, max=27432, avg=19826.21, stdev=806.26, samples=76 00:35:37.478 lat (usec) : 20=0.01%, 50=0.01%, 100=1.28%, 250=25.02%, 500=38.02% 00:35:37.478 lat (usec) : 750=24.19%, 1000=9.82% 00:35:37.478 lat (msec) : 2=1.67%, 4=0.01% 00:35:37.478 cpu : usr=99.59%, sys=0.00%, ctx=54, majf=0, minf=290 00:35:37.478 IO depths : 1=7.2%, 2=26.5%, 4=53.0%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:37.478 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:37.478 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:37.478 issued rwts: total=183209,197416,0,0 short=0,0,0,0 dropped=0,0,0,0 00:35:37.478 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:37.478 00:35:37.478 Run status group 0 (all jobs): 00:35:37.478 READ: bw=71.6MiB/s (75.0MB/s), 71.6MiB/s-71.6MiB/s (75.0MB/s-75.0MB/s), io=716MiB (750MB), run=10001-10001msec 00:35:37.478 WRITE: bw=79.1MiB/s (82.9MB/s), 79.1MiB/s-79.1MiB/s (82.9MB/s-82.9MB/s), io=771MiB (809MB), run=9750-9750msec 00:35:37.478 00:35:37.478 real 0m13.555s 00:35:37.478 user 0m45.957s 00:35:37.478 sys 0m0.496s 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:35:37.478 ************************************ 00:35:37.478 END TEST bdev_fio_rw_verify 00:35:37.478 ************************************ 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:35:37.478 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "458a9beb-ccdf-5581-a8f1-6994d0312047"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "458a9beb-ccdf-5581-a8f1-6994d0312047",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "677c9d01-f1dc-530a-b77f-ca6218c08ed1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "677c9d01-f1dc-530a-b77f-ca6218c08ed1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": 
"test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ae662a0c-c863-512f-afe5-057b76117c11"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ae662a0c-c863-512f-afe5-057b76117c11",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "463f6e78-c318-5381-89b9-ccf81b29ba95"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "463f6e78-c318-5381-89b9-ccf81b29ba95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": 
false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:35:37.479 crypto_ram1 00:35:37.479 crypto_ram2 00:35:37.479 crypto_ram3 ]] 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "458a9beb-ccdf-5581-a8f1-6994d0312047"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "458a9beb-ccdf-5581-a8f1-6994d0312047",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "677c9d01-f1dc-530a-b77f-ca6218c08ed1"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "677c9d01-f1dc-530a-b77f-ca6218c08ed1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ae662a0c-c863-512f-afe5-057b76117c11"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ae662a0c-c863-512f-afe5-057b76117c11",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "463f6e78-c318-5381-89b9-ccf81b29ba95"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "463f6e78-c318-5381-89b9-ccf81b29ba95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 
00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:37.479 ************************************ 00:35:37.479 START TEST bdev_fio_trim 00:35:37.479 ************************************ 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:35:37.479 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:37.479 20:11:27 
blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:37.480 20:11:27 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:37.480 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:37.480 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:37.480 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:37.480 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:37.480 fio-3.35 00:35:37.480 Starting 4 threads 00:35:49.687 00:35:49.687 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1584443: Wed Jul 24 20:11:40 2024 00:35:49.687 write: IOPS=31.1k, BW=121MiB/s (127MB/s)(1214MiB/10001msec); 0 zone resets 00:35:49.687 slat (usec): min=11, max=541, avg=76.20, stdev=87.64 00:35:49.687 clat (usec): min=29, max=2324, avg=268.42, stdev=304.30 00:35:49.687 lat (usec): min=47, max=2762, avg=344.62, stdev=379.06 00:35:49.687 clat percentiles (usec): 00:35:49.688 | 50.000th=[ 180], 99.000th=[ 1647], 99.900th=[ 2147], 99.990th=[ 2245], 00:35:49.688 | 99.999th=[ 2311] 00:35:49.688 bw ( KiB/s): min=88065, max=166720, per=99.79%, avg=124059.84, stdev=6522.23, samples=76 00:35:49.688 iops : min=22016, max=41680, avg=31014.79, stdev=1630.55, samples=76 00:35:49.688 trim: IOPS=31.1k, BW=121MiB/s (127MB/s)(1214MiB/10001msec); 0 zone resets 00:35:49.688 slat (usec): min=4, max=1592, avg=20.45, stdev=15.46 00:35:49.688 clat (usec): min=38, max=2763, avg=344.88, stdev=379.17 00:35:49.688 lat (usec): min=46, max=2855, avg=365.33, stdev=391.11 00:35:49.688 clat percentiles (usec): 00:35:49.688 | 50.000th=[ 219], 99.000th=[ 2040], 99.900th=[ 2573], 99.990th=[ 2704], 00:35:49.688 | 99.999th=[ 2737] 00:35:49.688 bw ( KiB/s): min=88065, max=166720, per=99.79%, avg=124059.84, stdev=6522.23, samples=76 00:35:49.688 iops : min=22016, max=41680, avg=31014.79, stdev=1630.55, samples=76 00:35:49.688 lat (usec) : 50=1.95%, 100=12.22%, 250=50.93%, 500=21.30%, 750=4.71% 00:35:49.688 lat (usec) : 1000=3.47% 00:35:49.688 lat (msec) : 2=4.62%, 4=0.80% 00:35:49.688 cpu : usr=99.42%, sys=0.00%, ctx=59, majf=0, minf=107 00:35:49.688 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:49.688 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, 
>=64=0.0% 00:35:49.688 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:49.688 issued rwts: total=0,310826,310827,0 short=0,0,0,0 dropped=0,0,0,0 00:35:49.688 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:49.688 00:35:49.688 Run status group 0 (all jobs): 00:35:49.688 WRITE: bw=121MiB/s (127MB/s), 121MiB/s-121MiB/s (127MB/s-127MB/s), io=1214MiB (1273MB), run=10001-10001msec 00:35:49.688 TRIM: bw=121MiB/s (127MB/s), 121MiB/s-121MiB/s (127MB/s-127MB/s), io=1214MiB (1273MB), run=10001-10001msec 00:35:49.688 00:35:49.688 real 0m13.572s 00:35:49.688 user 0m45.871s 00:35:49.688 sys 0m0.521s 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:35:49.688 ************************************ 00:35:49.688 END TEST bdev_fio_trim 00:35:49.688 ************************************ 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:35:49.688 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:35:49.688 00:35:49.688 real 0m27.499s 00:35:49.688 user 1m32.024s 00:35:49.688 sys 0m1.215s 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:49.688 ************************************ 00:35:49.688 END TEST bdev_fio 00:35:49.688 ************************************ 00:35:49.688 20:11:40 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap 
cleanup SIGINT SIGTERM EXIT 00:35:49.688 20:11:40 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:49.688 20:11:40 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:35:49.688 20:11:40 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:49.688 20:11:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:49.688 ************************************ 00:35:49.688 START TEST bdev_verify 00:35:49.688 ************************************ 00:35:49.688 20:11:40 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:49.688 [2024-07-24 20:11:40.985320] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:35:49.688 [2024-07-24 20:11:40.985405] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1585853 ] 00:35:49.688 [2024-07-24 20:11:41.113361] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:49.688 [2024-07-24 20:11:41.215948] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:49.688 [2024-07-24 20:11:41.215953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:49.688 [2024-07-24 20:11:41.237425] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:49.688 [2024-07-24 20:11:41.245451] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:49.688 [2024-07-24 20:11:41.253480] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:49.947 [2024-07-24 20:11:41.360906] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:52.482 [2024-07-24 20:11:43.585339] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:52.482 [2024-07-24 20:11:43.585438] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:52.482 [2024-07-24 20:11:43.585452] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:52.482 [2024-07-24 20:11:43.593357] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:52.482 [2024-07-24 20:11:43.593379] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:52.482 [2024-07-24 20:11:43.593398] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:52.482 
[2024-07-24 20:11:43.601376] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:52.482 [2024-07-24 20:11:43.601402] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:52.482 [2024-07-24 20:11:43.601414] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:52.482 [2024-07-24 20:11:43.609405] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:52.482 [2024-07-24 20:11:43.609423] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:52.482 [2024-07-24 20:11:43.609434] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:52.482 Running I/O for 5 seconds... 00:35:57.769 00:35:57.769 Latency(us) 00:35:57.769 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:57.769 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 Verification LBA range: start 0x0 length 0x1000 00:35:57.769 crypto_ram : 5.07 476.75 1.86 0.00 0.00 267641.03 5413.84 165036.74 00:35:57.769 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 Verification LBA range: start 0x1000 length 0x1000 00:35:57.769 crypto_ram : 5.07 378.52 1.48 0.00 0.00 336654.10 16982.37 206067.98 00:35:57.769 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 Verification LBA range: start 0x0 length 0x1000 00:35:57.769 crypto_ram1 : 5.07 479.59 1.87 0.00 0.00 265485.43 6724.56 151359.67 00:35:57.769 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 Verification LBA range: start 0x1000 length 0x1000 00:35:57.769 crypto_ram1 : 5.07 379.93 1.48 0.00 0.00 334269.14 221.72 187831.87 00:35:57.769 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 
Verification LBA range: start 0x0 length 0x1000 00:35:57.769 crypto_ram2 : 5.05 3676.32 14.36 0.00 0.00 34516.15 5955.23 27696.08 00:35:57.769 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 Verification LBA range: start 0x1000 length 0x1000 00:35:57.769 crypto_ram2 : 5.06 2972.25 11.61 0.00 0.00 42581.60 3675.71 32141.13 00:35:57.769 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 Verification LBA range: start 0x0 length 0x1000 00:35:57.769 crypto_ram3 : 5.05 3675.13 14.36 0.00 0.00 34438.47 5613.30 27696.08 00:35:57.769 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:57.769 Verification LBA range: start 0x1000 length 0x1000 00:35:57.769 crypto_ram3 : 5.07 2980.98 11.64 0.00 0.00 42382.67 2678.43 32141.13 00:35:57.769 =================================================================================================================== 00:35:57.769 Total : 15019.48 58.67 0.00 0.00 67713.28 221.72 206067.98 00:35:57.769 00:35:57.769 real 0m8.268s 00:35:57.769 user 0m15.622s 00:35:57.769 sys 0m0.409s 00:35:57.769 20:11:49 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:35:57.769 20:11:49 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:35:57.769 ************************************ 00:35:57.769 END TEST bdev_verify 00:35:57.769 ************************************ 00:35:57.769 20:11:49 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:57.769 20:11:49 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:35:57.769 20:11:49 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:35:57.769 20:11:49 blockdev_crypto_qat 
-- common/autotest_common.sh@10 -- # set +x 00:35:57.769 ************************************ 00:35:57.769 START TEST bdev_verify_big_io 00:35:57.769 ************************************ 00:35:57.769 20:11:49 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:57.769 [2024-07-24 20:11:49.338755] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:35:57.769 [2024-07-24 20:11:49.338818] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1586866 ] 00:35:58.029 [2024-07-24 20:11:49.466575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:58.029 [2024-07-24 20:11:49.568703] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:58.029 [2024-07-24 20:11:49.568709] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:58.029 [2024-07-24 20:11:49.590163] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:58.029 [2024-07-24 20:11:49.598195] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:58.029 [2024-07-24 20:11:49.606224] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:58.287 [2024-07-24 20:11:49.714589] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:00.820 [2024-07-24 20:11:51.939732] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:00.820 [2024-07-24 20:11:51.939819] bdev.c:8190:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc0 00:36:00.820 [2024-07-24 20:11:51.939834] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:00.820 [2024-07-24 20:11:51.947751] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:00.820 [2024-07-24 20:11:51.947773] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:00.820 [2024-07-24 20:11:51.947784] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:00.820 [2024-07-24 20:11:51.955776] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:00.820 [2024-07-24 20:11:51.955796] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:00.820 [2024-07-24 20:11:51.955808] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:00.820 [2024-07-24 20:11:51.963798] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:00.820 [2024-07-24 20:11:51.963817] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:00.820 [2024-07-24 20:11:51.963829] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:00.820 Running I/O for 5 seconds... 00:36:01.387 [2024-07-24 20:11:52.979583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.387 [2024-07-24 20:11:52.980133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.387 [2024-07-24 20:11:52.980233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.650 [2024-07-24 20:11:53.030956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.031009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.031566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.031588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.035834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.035892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.035943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.035996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.036585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.036642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.036695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.036748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.037232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.650 [2024-07-24 20:11:53.037253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.041303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.041362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.041425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.041478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.041871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.041926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.041977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.042028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.042366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.042387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.045409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.045467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.650 [2024-07-24 20:11:53.045519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.045572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.046108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.046163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.046214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.046266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.046804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.650 [2024-07-24 20:11:53.046827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.049581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.049639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.049690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.049741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.050129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.651 [2024-07-24 20:11:53.050193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.050248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.050300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.050713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.050736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.054363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.054425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.054477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.054529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.054914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.054978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.055030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.055082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.651 [2024-07-24 20:11:53.055486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.055511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.058603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.058679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.058732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.058785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.059421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.059478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.059531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.059584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.060066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.060093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.062732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.651 [2024-07-24 20:11:53.062789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.062852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.062903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.063292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.063347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.063404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.063457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.063966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.063988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.067394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.067460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.067511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.067563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.651 [2024-07-24 20:11:53.067951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.068006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.068058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.068110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.068643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.068664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.071685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.071742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.071794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.071859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.072494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.072551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.072604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.651 [2024-07-24 20:11:53.072656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.073135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.073160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.075805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.075866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.075927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.075978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.076366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.076427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.076480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.076531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.077019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.077039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.651 [2024-07-24 20:11:53.080377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.080440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.080500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.080552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.080945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.081000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.081052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.081105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.081609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.081631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.084635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.084693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.084745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.651 [2024-07-24 20:11:53.084798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.085444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.085501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.085553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.651 [2024-07-24 20:11:53.085607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.086067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.086087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.088626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.088683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.088735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.088786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.089175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.089239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.652 [2024-07-24 20:11:53.089291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.089343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.089740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.089761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.093342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.093404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.093456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.093509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.093903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.093968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.094020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.094072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.094492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.652 [2024-07-24 20:11:53.094516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.097210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.097271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.097323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.097376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.097949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.098005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.098058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.098111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.098641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.098663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.101178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.101240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.652 [2024-07-24 20:11:53.101291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.101342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.101735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.101790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.101842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.101899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.102242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.102262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.105644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.105707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.105763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.105814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.652 [2024-07-24 20:11:53.106203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.652 [2024-07-24 20:11:53.106258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:01.916 [2024-07-24 20:11:53.406051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.406074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.409681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.410180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.410686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.411188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.412231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.412734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.413236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.413742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.414274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.414296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.417873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.916 [2024-07-24 20:11:53.418370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.418879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.419397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.916 [2024-07-24 20:11:53.420459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.420954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.421467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.421972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.422518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.422541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.426104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.426608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.428044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.428797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.917 [2024-07-24 20:11:53.429648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.430144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.430641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.432266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.432651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.432672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.436700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.438415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.438913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.439414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.441532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.443350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.445418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.917 [2024-07-24 20:11:53.447267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.447644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.447666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.450474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.450974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.453041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.455136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.457283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.459311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.461372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.463466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.463867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.463889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.917 [2024-07-24 20:11:53.468634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.470717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.472658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.474650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.477078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.479155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.479659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.480155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.480703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.480728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.484575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.486532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.488637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:01.917 [2024-07-24 20:11:53.490179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.491208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.491711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.493470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.495362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.495720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.495742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.499397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.499897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.500397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.500891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.503226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:01.917 [2024-07-24 20:11:53.505306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.180 [2024-07-24 20:11:53.506844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.508903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.509252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.509274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.512385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.514448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.516511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.516562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.518638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.520705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.522761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.524824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.525227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.180 [2024-07-24 20:11:53.525249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.530009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.532087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.534130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.535731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.535789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.536184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.538307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.540280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.540780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.541279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.541853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.541875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.180 [2024-07-24 20:11:53.543931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.543990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.544044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.544099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.544471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.544550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.544615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.544669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.544721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.545067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.545088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.547719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.547778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.180 [2024-07-24 20:11:53.547831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.547884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.548305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.548375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.548436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.548487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.548545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.548890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.548912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.550969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.551027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.551079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.551131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.180 [2024-07-24 20:11:53.551477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.551550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.551602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.551660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.551714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.552205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.552227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.554891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.554949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.555002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.555058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.555406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.555485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.180 [2024-07-24 20:11:53.555543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.555595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.555646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.556058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.556082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.558148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.558207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.558259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.558311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.558848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.558919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.558973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.559026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.180 [2024-07-24 20:11:53.559085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.559660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.559683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.561784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.561849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.561901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.180 [2024-07-24 20:11:53.561954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.181 [2024-07-24 20:11:53.562294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.181 [2024-07-24 20:11:53.562363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.181 [2024-07-24 20:11:53.562432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.181 [2024-07-24 20:11:53.562501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.181 [2024-07-24 20:11:53.562555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.181 [2024-07-24 20:11:53.562900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.181 [2024-07-24 20:11:53.562921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:02.184 [last message repeated for each subsequent allocation attempt through 2024-07-24 20:11:53.655961]
00:36:02.184 [2024-07-24 20:11:53.656016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.656363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.656383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.658745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.658807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.658860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.659355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.659881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.659950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.660002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.660053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.660105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.660523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.184 [2024-07-24 20:11:53.660544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.664608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.666671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.667169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.667666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.668199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.669555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.671384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.673467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.675512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.675916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.675936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.678608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.184 [2024-07-24 20:11:53.679106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.680841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.682778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.683122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.685178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.686898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.688736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.690799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.691144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.691171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.696375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.698469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.699325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.184 [2024-07-24 20:11:53.701376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.701722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.703536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.704033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.704532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.705024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.705485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.705509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.708897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.709406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.709902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.710403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.710884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.184 [2024-07-24 20:11:53.711401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.711898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.712396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.712889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.713464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.184 [2024-07-24 20:11:53.713486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.716918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.717440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.717934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.718433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.718923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.719439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.719938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.185 [2024-07-24 20:11:53.720444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.720937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.721468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.721490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.724887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.725398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.725895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.726399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.726885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.727404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.727899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.728395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.728889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.185 [2024-07-24 20:11:53.729428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.729450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.732852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.733369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.733869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.734362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.734844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.735353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.735854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.736347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.736844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.737343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.737365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.185 [2024-07-24 20:11:53.740794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.741307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.741807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.742320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.742855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.743380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.743883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.744374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.744877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.745364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.745386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.748759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.749266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.185 [2024-07-24 20:11:53.749765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.750274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.750816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.751342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.751844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.752337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.752834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.753284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.753307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.756704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.757209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.757709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.758208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.185 [2024-07-24 20:11:53.758741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.759261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.759761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.760256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.760753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.761192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.761212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.764641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.765143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.765653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.766154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.766678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.767200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.185 [2024-07-24 20:11:53.767705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.768196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.768694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.769141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.185 [2024-07-24 20:11:53.769161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.772518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.773028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.773531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.774031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.774564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.775091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.775592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.776084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.446 [2024-07-24 20:11:53.776586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.777010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.777031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.780375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.780883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.781378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.781878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.782384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.782911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.783411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.783902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.784399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.784847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.446 [2024-07-24 20:11:53.784881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.787950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.788458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.788956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.789470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.789960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.790475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.792240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.794145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.796245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.796596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.796618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.446 [2024-07-24 20:11:53.799083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.446 [2024-07-24 20:11:53.799584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.091728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (identical error repeated continuously between the two timestamps above)
00:36:02.712 [2024-07-24 20:11:54.091749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.093843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.093899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.093955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.094951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.097944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.712 [2024-07-24 20:11:54.098012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.098064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.098123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.098468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.098543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.098607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.098658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.098709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.099054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.099074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.101159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.101215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.101272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.712 [2024-07-24 20:11:54.101326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.101866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.101930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.101984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.102037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.102090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.102597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.712 [2024-07-24 20:11:54.102619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.104984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.105050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.105101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.105152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.105577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.713 [2024-07-24 20:11:54.105647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.105705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.105756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.105807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.106176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.106197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.108660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.108717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.108774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.108850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.109453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.109527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.109586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.713 [2024-07-24 20:11:54.109639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.109692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.110079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.110100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.112987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.713 [2024-07-24 20:11:54.113595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.113617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.116917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.117464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.117485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.713 [2024-07-24 20:11:54.119583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.119641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.119693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.119745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.120275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.120337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.120397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.120450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.120520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.121136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.121158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.124213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.124269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.713 [2024-07-24 20:11:54.124327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.124383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.124957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.125022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.125087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.125155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.125223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.125749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.125769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.128904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.128972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.129024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.129077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.713 [2024-07-24 20:11:54.129529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.129609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.129669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.129722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.129774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.130324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.130345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.133289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.133352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.133411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.133466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.134000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.134076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.713 [2024-07-24 20:11:54.134131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.134183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.134235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.134759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.134780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.137658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.137716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.713 [2024-07-24 20:11:54.137771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.137823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.138333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.138420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.138488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.138542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.714 [2024-07-24 20:11:54.138594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.139114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.139135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.142993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.143044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.143576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.714 [2024-07-24 20:11:54.143608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.146522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.146578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.146632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.146685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.147227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.147316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.147383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.147441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.147493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.148045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.148066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.150960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.714 [2024-07-24 20:11:54.151018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.151072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.151126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.151658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.151733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.151804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.151871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.151928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.152379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.152408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.155456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.155514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.714 [2024-07-24 20:11:54.155566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.714 [2024-07-24 20:11:54.155618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:02.716 [... last message repeated for subsequent allocation attempts, timestamps 20:11:54.156114 through 20:11:54.406420 ...]
00:36:02.977 [2024-07-24 20:11:54.406928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.407428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.407946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.408432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.410265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.412347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.414438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.415686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.416085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.416106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.418996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.420087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.421929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.977 [2024-07-24 20:11:54.424020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.424366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.425624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.427453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.429540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.431617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.432106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.432127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.436814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.438906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.440135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.441999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.977 [2024-07-24 20:11:54.442349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.978 [2024-07-24 20:11:54.444465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.445131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.445630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.446125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.446637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.446658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.450607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.452688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.454740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.455238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.455779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.456297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.457428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.978 [2024-07-24 20:11:54.459254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.461342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.461692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.461713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.464228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.464740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.465244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.466859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.467237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.469342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.471440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.472978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.474819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.978 [2024-07-24 20:11:54.475166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.475186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.479240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.481071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.483180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.485251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.485660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.487480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.489563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.491617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.492115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.492672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.492695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.978 [2024-07-24 20:11:54.497271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.498508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.500335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.502417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.502763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.503406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.503901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.504400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.505585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.505972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.505993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.510121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.512182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.978 [2024-07-24 20:11:54.512687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.513195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.513784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.515013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.516837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.518925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.520973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.521445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.521466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.524135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.524652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.526217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.528032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.978 [2024-07-24 20:11:54.528379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.530487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.531993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.533822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.535913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.536260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.536281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.541161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.543234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.545312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.546807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.547219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.549310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.978 [2024-07-24 20:11:54.551395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.551897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.552400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.553007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.553028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.556324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.558158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.560248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.562304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.562840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.563352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.978 [2024-07-24 20:11:54.563856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.979 [2024-07-24 20:11:54.564925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:02.979 [2024-07-24 20:11:54.566764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.979 [2024-07-24 20:11:54.567113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:02.979 [2024-07-24 20:11:54.567134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.571282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.571797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.572298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.572798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.573250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.575342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.577416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.578271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.580349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.580708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.241 [2024-07-24 20:11:54.580729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.585514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.587578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.589616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.591063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.591444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.593258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.595343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.597303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.597813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.598365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.598387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.601831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.241 [2024-07-24 20:11:54.602334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.602837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.603339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.603910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.604423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.604919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.605426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.605929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.606478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.606499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.609924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.610441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.610940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.241 [2024-07-24 20:11:54.611443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.612019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.612536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.613042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.613542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.614040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.614609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.614631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.618091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.618610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.618677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.619171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.241 [2024-07-24 20:11:54.619689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.241 [2024-07-24 20:11:54.620198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.244 [last message repeated continuously for timestamps 20:11:54.620198 through 20:11:54.727857] 
00:36:03.244 [2024-07-24 20:11:54.727909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.728303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.728374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.728435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.728487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.728539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.728883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.728904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.731647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.731706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.731760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.731812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.732195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.244 [2024-07-24 20:11:54.732266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.244 [2024-07-24 20:11:54.732320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.732379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.732438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.732784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.732805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.734875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.734932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.734985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.735036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.735384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.735462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.735523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.245 [2024-07-24 20:11:54.735576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.735629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.736142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.736163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.738836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.738894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.738950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.739006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.739346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.739423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.739476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.739529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.739580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.245 [2024-07-24 20:11:54.740097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.740117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.742199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.742257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.742312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.742366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.742901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.742965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.743033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.743104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.743159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.743704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.743726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.245 [2024-07-24 20:11:54.745835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.745893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.745954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.746005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.746435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.746511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.746565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.746617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.746669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.747006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.747027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.749667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.749724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.245 [2024-07-24 20:11:54.749777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.749832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.750268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.750334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.750386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.750444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.750497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.750871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.750892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.752990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.753048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.755125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.755182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.245 [2024-07-24 20:11:54.755654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.755729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.755782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.755833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.755886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.756433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.756461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.759052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.759109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.245 [2024-07-24 20:11:54.759161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.761223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.761647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.761715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.246 [2024-07-24 20:11:54.761768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.761823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.761882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.762230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.762250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.765410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.767354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.769413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.771495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.771863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.773661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.775579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.777670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.246 [2024-07-24 20:11:54.779323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.779841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.779863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.784502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.786572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.788180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.789981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.790323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.792425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.792927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.793429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.793934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.794364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.246 [2024-07-24 20:11:54.794385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.798338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.800412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.802147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.802646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.803183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.803709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.805066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.806873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.808935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.809279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.809299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.811899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.246 [2024-07-24 20:11:54.812411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.812909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.814818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.815201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.817298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.819363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.821110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.822961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.823304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.823325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.827326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.829148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.246 [2024-07-24 20:11:54.831213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.507 [2024-07-24 20:11:54.833273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.833737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.835578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.837654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.839745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.840247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.840786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.840808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.845464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.846709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.848539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.850641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.850986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.507 [2024-07-24 20:11:54.851637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.852132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.852649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.854000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.854379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.854406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.858374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.860035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.860537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.861031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.861572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.863310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.507 [2024-07-24 20:11:54.865187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.507 [2024-07-24 20:11:54.867278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:03.773 [2024-07-24 20:11:55.159379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.161461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.163415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.165274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.165684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.165705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.168419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.168477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.168529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.168588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.169112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.169176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.169233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.773 [2024-07-24 20:11:55.169285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.169337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.169718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.169739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.171883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.171940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.171992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.172064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.172407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.172479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.172536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.172588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.172639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.773 [2024-07-24 20:11:55.173101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.173122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.175967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.176028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.176080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.176132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.773 [2024-07-24 20:11:55.176483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.176555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.176609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.176661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.176720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.177063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.177083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.774 [2024-07-24 20:11:55.179202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.179287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.179340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.179397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.179963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.180029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.180082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.180135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.180187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.180724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.180746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.182942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.183005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.774 [2024-07-24 20:11:55.183057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.183111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.183457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.183529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.183589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.183655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.183707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.184046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.184067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.186660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.186718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.186770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.186823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.774 [2024-07-24 20:11:55.187306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.187373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.187430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.187483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.187534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.187941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.187961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.774 [2024-07-24 20:11:55.190652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.190767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.191250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.191271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.193925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.193982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.194039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.194113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.194459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.194530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.194584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.194635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.774 [2024-07-24 20:11:55.194686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.195125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.195146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.197317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.197375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.197437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.197491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.198034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.198102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.198160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.198217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.198269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.198807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.774 [2024-07-24 20:11:55.198829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.200993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.201777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.202113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.202133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.204933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.774 [2024-07-24 20:11:55.204991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.205047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.205103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.205446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.205519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.205579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.205636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.774 [2024-07-24 20:11:55.205688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.206030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.206050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.775 [2024-07-24 20:11:55.208357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.208945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.209513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.209535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.775 [2024-07-24 20:11:55.212654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.212821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.213216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.213237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.215584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.215642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.215695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.215748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.216232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.216296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.216352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.775 [2024-07-24 20:11:55.216410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.216464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.216986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.217006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:03.775 [2024-07-24 20:11:55.219898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.775 [2024-07-24 20:11:55.220236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:03.775 [... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously for timestamps 20:11:55.220256 through 20:11:55.425080; duplicate occurrences omitted ...] 
00:36:04.039 [2024-07-24 20:11:55.425584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.039 [2024-07-24 20:11:55.426089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.426593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.427137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.427162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.430511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.431021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.431523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.432016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.432607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.433117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.433622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.434118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.434614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.039 [2024-07-24 20:11:55.435155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.435176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.438599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.439104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.439607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.440099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.440634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.441139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.441644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.442139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.442634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.443147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.443168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.039 [2024-07-24 20:11:55.446703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.447208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.447708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.448201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.448741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.449260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.449769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.450264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.450777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.451284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.451305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.454806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.455317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.039 [2024-07-24 20:11:55.455817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.456310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.456828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.457338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.457842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.458337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.458839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.459270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.459291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.462775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.463280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.463783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.464275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.039 [2024-07-24 20:11:55.464802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.465315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.465816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.466307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.466809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.467258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.467279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.470781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.471289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.471789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.039 [2024-07-24 20:11:55.472281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.472762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.473271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.040 [2024-07-24 20:11:55.473771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.474265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.474761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.475258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.475279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.478782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.482110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.524027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.524106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.525474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.525533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.527227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.527507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.040 [2024-07-24 20:11:55.530919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.532654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.534048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.534444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.534929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.535293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.535346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.535714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.535765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.536131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.538042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.538368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.538384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.040 [2024-07-24 20:11:55.538403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.538417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.543901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.545615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.547434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.547826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.548677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.549067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.549462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.550503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.550777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.550794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.550809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.040 [2024-07-24 20:11:55.550825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.552969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.554548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.556536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.558402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.560015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.560411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.560798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.561186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.561672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.561690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.561705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.561720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.040 [2024-07-24 20:11:55.566245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.568183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.569539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.571264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.573521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.573920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.574309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.574701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.575163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.575180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.575195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.575209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.578787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.040 [2024-07-24 20:11:55.580524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.581705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.583322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.585569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.587575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.589579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.589972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.590473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.590492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.590507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.590523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.595574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.597310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.040 [2024-07-24 20:11:55.598535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.600129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.602396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.604396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.606317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.606726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.607243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.607261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.607276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.607292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.040 [2024-07-24 20:11:55.611555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.613372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.615105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.041 [2024-07-24 20:11:55.616831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.618401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.619772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.621505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.623388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.623667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.623684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.623699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.623713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.041 [2024-07-24 20:11:55.628507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.303 [2024-07-24 20:11:55.630319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.303 [2024-07-24 20:11:55.632311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.303 [2024-07-24 20:11:55.634301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.303 [2024-07-24 20:11:55.636691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:04.306 [2024-07-24 20:11:55.825114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.825506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.825848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.826256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.826313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.826724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.826773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.827272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.827289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.827305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.827320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.827336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.829686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.306 [2024-07-24 20:11:55.830086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.830145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.830545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.831012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.831419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.831473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.831879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.831924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.832356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.832374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.832398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.306 [2024-07-24 20:11:55.832414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.832429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.307 [2024-07-24 20:11:55.835986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.836387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.836439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.836481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.836921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.837322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.837377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.839344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.839404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.839684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.839701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.839715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.839729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.307 [2024-07-24 20:11:55.839743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.841800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.841845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.841892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.841934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.842381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.842787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.842831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.842872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.842912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.843327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.843347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.843362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.307 [2024-07-24 20:11:55.843376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.843400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.847673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.847726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.847768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.847811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.307 [2024-07-24 20:11:55.848648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.848676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.850292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.850355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.850404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.850451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.850929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.850987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.851029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.851072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.851114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.851575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.307 [2024-07-24 20:11:55.851593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.851607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.851621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.851635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.855770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.307 [2024-07-24 20:11:55.856238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.856255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.856269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.856283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.856296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.857828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.857873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.857917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.307 [2024-07-24 20:11:55.857958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.308 [2024-07-24 20:11:55.858425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.858873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.861652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.861702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.861744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.861784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.308 [2024-07-24 20:11:55.862194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.862568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.308 [2024-07-24 20:11:55.864786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.864877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.865149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.865169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.865183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.865198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.865211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.868635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.868684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.868726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.868766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.308 [2024-07-24 20:11:55.869119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.869580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.308 [2024-07-24 20:11:55.871663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.871851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.872122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.872139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.308 [2024-07-24 20:11:55.872153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.309 [2024-07-24 20:11:55.872172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.309 [2024-07-24 20:11:55.872185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.309 [2024-07-24 20:11:55.876440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.309 [2024-07-24 20:11:55.876491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.309 [2024-07-24 20:11:55.876534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.309 [2024-07-24 20:11:55.876575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.978405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
(identical "Failed to get src_mbufs!" messages repeated many times between 20:11:55.876575 and 20:11:55.978405; duplicates elided)
00:36:04.574 [2024-07-24 20:11:55.978421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.978435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.978450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.978463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.983416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.983482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.985123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.985175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.985456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.985511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.986956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.987004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.987889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.574 [2024-07-24 20:11:55.988216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.988233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.988247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.988261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.988275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.992765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.992823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.993210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.993254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.993697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.993752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.995748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.995794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.574 [2024-07-24 20:11:55.997715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.997993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.998013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.998027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.998042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:55.998056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.004813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.004871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.005369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.005767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.006222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.006277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.006672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.574 [2024-07-24 20:11:56.006716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.007103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.007516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.007534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.007547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.007562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.007576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.013987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.015984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.017993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.019812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.020200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.020263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.574 [2024-07-24 20:11:56.020684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.021075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.021474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.021902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.021920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.021936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.021953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.021974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.026694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.028309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.030308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.032175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.574 [2024-07-24 20:11:56.032465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.033731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.034124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.034521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.034909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.035406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.035426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.035442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.035458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.035473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.039813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.041823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.043451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.045182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.045472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.047136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.047534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.047926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.048315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.048834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.048854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.048869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.048884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.048899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.053616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.055429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.056781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.058515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.058794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.060797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.061193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.061590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.061979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.062441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.062461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.062476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.062492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.062506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.067902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.069354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.070734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.072465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.072753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.074576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.075064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.075468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.075858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.076275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.076292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.076308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.076324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.076339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.081835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.083017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.084385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.086128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.086423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.088168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.088873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.089278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.089680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.090096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.090113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.090128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.090142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.090157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.095556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.096613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.097931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.099742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.100025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.101771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.102684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.103075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.103474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.103857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.103873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.103888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.103902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.103916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.108979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.109502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.111357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.113012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.113292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.115014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.116691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.117082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.117481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.117915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.117933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.575 [2024-07-24 20:11:56.117948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.575 [2024-07-24 20:11:56.117962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.117976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.123230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.124101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.126078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.127447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.127726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.129506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.131439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.131832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.132221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.132681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.576 [2024-07-24 20:11:56.132699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.576 [2024-07-24 20:11:56.132714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.321523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.321964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.322366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.322427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.322821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.322870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.323278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.323295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.323309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.323324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.323338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.326347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.326759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.326807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.327195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.327641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.328944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.332278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.332898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.332949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.334470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.334892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.335303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.335353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.335753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.335801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.336222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.336240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.336256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.336270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.336286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.339498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.339902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.339946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.340006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.340529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.340930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.340975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.342378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.342433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.342774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.342791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.342805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.342819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.342833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.346835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.346888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.346929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.346971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.347240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.349494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.349508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.355998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.356015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.840 [2024-07-24 20:11:56.356030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.840 [2024-07-24 20:11:56.356044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.356058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.359965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.841 [2024-07-24 20:11:56.360875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.360903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.364725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.364781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.364822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.364862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.841 [2024-07-24 20:11:56.365620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.365667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.370600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.370657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.370703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.370744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.841 [2024-07-24 20:11:56.371486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.371546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.374584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.374635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.374677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.374718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.841 [2024-07-24 20:11:56.375326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.375748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.380519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.380572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.380614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.380655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.380973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.841 [2024-07-24 20:11:56.381114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.381491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.385739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.385792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.385835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.385876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.386328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.386379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.841 [2024-07-24 20:11:56.386437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.386484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.841 [2024-07-24 20:11:56.386527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.386799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.386816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.386830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.386844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.386858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.391858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.391911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.391961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.392006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:04.842 [2024-07-24 20:11:56.392275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:04.842 [2024-07-24 20:11:56.392324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:05.107 [... previous message repeated for every subsequent allocation attempt from 2024-07-24 20:11:56.392365 through 2024-07-24 20:11:56.513908; identical repeats omitted ...]
00:36:05.107 [2024-07-24 20:11:56.513931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.513945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.519184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.519238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.519633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.519691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.519966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.520018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.520725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.520769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.521155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.521567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.521584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.107 [2024-07-24 20:11:56.521598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.521613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.521627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.527756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.107 [2024-07-24 20:11:56.527810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.528200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.528243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.528684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.528737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.529780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.529827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.531507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.531779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.108 [2024-07-24 20:11:56.531795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.531809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.531824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.531838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.538964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.539023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.540441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.540486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.540804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.540858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.542189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.542233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.542628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.108 [2024-07-24 20:11:56.543095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.543112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.543127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.543141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.543155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.550230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.550286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.552005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.552680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.552954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.553008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.554360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.554412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.108 [2024-07-24 20:11:56.556130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.556407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.556424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.556439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.556453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.556466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.562239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.563598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.565393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.567379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.567661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.567722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.568479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.108 [2024-07-24 20:11:56.570463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.571906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.572179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.572196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.572210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.572224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.572238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.579191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.580253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.580649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.581295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.581577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.582940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.108 [2024-07-24 20:11:56.584604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.586304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.587886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.588294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.588310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.588324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.588338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.588352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.593560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.593967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.594358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.594750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.595024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.108 [2024-07-24 20:11:56.596361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.598085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.599933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.601920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.602383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.602407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.602430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.602449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.602463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.609188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.609597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.609988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.611716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.108 [2024-07-24 20:11:56.612100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.612520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.613104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.614859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.616802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.108 [2024-07-24 20:11:56.617082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.617099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.617113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.617127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.617141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.623614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.624552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.624943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.109 [2024-07-24 20:11:56.625332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.625701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.626100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.626504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.628495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.630037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.630312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.630329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.630343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.630357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.630371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.636910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.638247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.109 [2024-07-24 20:11:56.639579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.640600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.641053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.641461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.643384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.643815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.644204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.644573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.644590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.644604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.644618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.644632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.650917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.109 [2024-07-24 20:11:56.652658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.654387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.655814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.656200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.656607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.656995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.657387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.657781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.658090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.658106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.658125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.658139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.658153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.109 [2024-07-24 20:11:56.664197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.665936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.667681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.669660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.670143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.672140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.672539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.672929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.674153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.674440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.674457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.674472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [2024-07-24 20:11:56.674486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.109 [2024-07-24 20:11:56.674500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.109 [... identical "Failed to get src_mbufs!" message repeated through 2024-07-24 20:11:56.876473 (log time 00:36:05.375) ...]
00:36:05.375 [2024-07-24 20:11:56.876535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.876932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.876982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.877367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.877383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.877403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.877418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.877433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.880220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.880625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.880681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.881074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.881531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.375 [2024-07-24 20:11:56.883405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.883454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.883848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.883901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.884357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.884374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.884395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.884411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.884426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.886580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.886976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.887033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.887437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.375 [2024-07-24 20:11:56.887875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.888271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.888319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.888713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.888757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.889209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.889226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.889241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.375 [2024-07-24 20:11:56.889256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.889271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.891388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.892671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.892715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.376 [2024-07-24 20:11:56.893103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.893558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.895384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.895437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.895829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.895876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.896332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.896350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.896370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.896385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.896405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.898727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.899123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.376 [2024-07-24 20:11:56.899168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.899216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.899742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.900138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.900183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.900583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.900631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.901026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.901042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.901056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.901075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.901088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.903442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.376 [2024-07-24 20:11:56.903489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.903531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.903573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.903844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.904947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.905606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.376 [2024-07-24 20:11:56.907895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.907942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.907984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.908026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.908471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.908524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.908566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.908625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.908667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.909174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.909191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.909206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.909221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.376 [2024-07-24 20:11:56.909235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.911485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.911530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.911571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.911611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.911881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.911938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.911980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.912021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.912061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.912518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.912535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.912553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.376 [2024-07-24 20:11:56.912568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.912582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.914677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.914724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.914770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.914812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.376 [2024-07-24 20:11:56.915830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.915858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.918308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.918357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.918408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.918450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.918899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.376 [2024-07-24 20:11:56.918953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.918997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.919063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.919117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.919494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.377 [2024-07-24 20:11:56.919512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.919526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.919541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.919555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.921613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.921659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.921701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.921745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.377 [2024-07-24 20:11:56.922600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.922659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.925772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.377 [2024-07-24 20:11:56.925814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.926250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.926267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.926282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.926297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.926311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.928618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.928664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.928725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.928766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.929146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.929210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.377 [2024-07-24 20:11:56.929251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.377 [2024-07-24 20:11:56.929312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:05.642 [2024-07-24 20:11:56.996101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:05.642 [2024-07-24 20:11:57.000136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.000195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.001916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.001964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.002240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.002299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.004013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.004059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.005192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.005505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.005523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.005537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.005552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.642 [2024-07-24 20:11:57.005565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.008396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.008446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.009955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.010002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.010376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.010440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.010830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.642 [2024-07-24 20:11:57.010874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.011630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.011900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.011916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.011931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.643 [2024-07-24 20:11:57.011945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.011959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.016056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.016110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.018003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.018050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.018323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.018382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.018884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.018930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.020724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.020997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.021014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.643 [2024-07-24 20:11:57.021028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.021042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.021056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.024635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.024689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.025078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.025122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.025567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.025620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.026991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.027037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.027842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.028264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.643 [2024-07-24 20:11:57.028281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.028297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.028312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.028327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.031669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.031720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.033401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.034458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.034729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.034790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.036715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.036765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.038692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.643 [2024-07-24 20:11:57.038966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.038982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.038996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.039010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.039024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.041343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.043261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.043696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.044086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.044440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.044499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.045931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.047921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.643 [2024-07-24 20:11:57.049906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.050177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.050194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.050208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.050222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.050236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.053839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.055563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.056178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.058122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.058551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.058955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.059573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.643 [2024-07-24 20:11:57.061313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.061707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.062151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.062173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.062188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.062204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.062219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.065433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.066472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.068162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.069520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.069791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.071789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.643 [2024-07-24 20:11:57.073722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.074129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.076105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.076596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.076613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.076627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.076642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.076657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.079156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.643 [2024-07-24 20:11:57.080854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.082207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.083930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.084200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.644 [2024-07-24 20:11:57.086086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.086555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.088513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.090166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.090445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.090462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.090476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.090490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.090508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.092549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.092947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.094043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.095305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.644 [2024-07-24 20:11:57.095803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.096203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.098005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.099354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.101071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.101343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.101359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.101375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.101393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.101408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.104808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.106533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.108174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.644 [2024-07-24 20:11:57.108569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.109044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.109454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.111040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.111813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.112203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.112616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.112642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.112656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.112671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.112685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.116116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.116989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.644 [2024-07-24 20:11:57.118468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.120457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.120734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.122506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.123698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.124088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.124487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.124888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.124905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.124920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.124934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.124948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.644 [2024-07-24 20:11:57.128316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.644 [2024-07-24 20:11:57.130223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:05.644-00:36:05.911 [... the same accel_dpdk_cryptodev.c:468 "*ERROR*: Failed to get src_mbufs!" message repeats approximately 270 more times between 20:11:57.130223 and 20:11:57.371954; duplicate log lines elided ...]
00:36:05.911 [2024-07-24 20:11:57.372002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.372270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.372287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.372301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.372315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.372329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.374123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.375856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.375904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.377556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.377986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.378396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.378442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.911 [2024-07-24 20:11:57.378831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.378876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.379340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.379357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.379372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.379387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.379414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.381177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.383133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.383181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.384207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.384488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.386190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.911 [2024-07-24 20:11:57.386238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.387956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.388002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.388273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.388290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.388304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.388318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.388331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.390482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.390880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.390925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.391933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.392207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.911 [2024-07-24 20:11:57.392621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.392671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.394654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.394705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.394995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.395012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.395026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.395040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.395054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.397192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.397599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.397645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.398035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.911 [2024-07-24 20:11:57.398482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.398883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.398933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.399321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.399366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.399800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.399818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.399836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.399852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.399867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.402194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.402601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.402654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.911 [2024-07-24 20:11:57.402696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.403183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.403588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.403638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.404026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.404069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.404514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.404531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.404546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.404561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.404575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.406775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.406821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.911 [2024-07-24 20:11:57.406866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.911 [2024-07-24 20:11:57.406907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.407326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.407734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.407779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.407821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.407862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.408212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.408229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.408243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.408258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.408272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.410647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.912 [2024-07-24 20:11:57.410692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.410734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.410774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.411851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.912 [2024-07-24 20:11:57.414370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.414421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.414463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.414504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.414963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.415624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.912 [2024-07-24 20:11:57.415638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.418835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.419223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.419240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.419255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.912 [2024-07-24 20:11:57.419269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.419283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.421625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.421679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.421728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.421769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.912 [2024-07-24 20:11:57.422846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.422876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.425292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.425344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.425386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.425435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.425909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.425961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.426006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.426047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.426088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.426517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.912 [2024-07-24 20:11:57.426534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.426548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.426565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.426580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:05.912 [2024-07-24 20:11:57.429862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:05.912 [2024-07-24 20:11:57.430242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:06.176 [2024-07-24 20:11:57.500047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.500095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.500372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.500443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.501488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.501544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.501940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.502282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.502299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.502314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.502329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.502343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.504933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:06.176 [2024-07-24 20:11:57.504990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.505381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.505445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.505908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.505964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.506688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.506735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.508160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.508446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.508463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.508477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.508492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.508505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:06.176 [2024-07-24 20:11:57.512150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.512212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.514129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.514178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.514461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.514515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.514989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.515037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.515434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.515876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.515893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.515908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.515922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:06.176 [2024-07-24 20:11:57.515936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.519280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.519336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.520691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.520736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.521040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.521102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.522448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.522497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.524215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.524500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.524517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.524531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:06.176 [2024-07-24 20:11:57.524545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.524559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.526766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.526843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.529168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.529216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.531240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.531527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.531544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.531558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.531572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.533301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:06.176 [2024-07-24 20:11:57.537139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:06.176 [2024-07-24 20:11:57.537553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:06.789
00:36:06.789 Latency(us)
00:36:06.789 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:36:06.789 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x0 length 0x100
00:36:06.789 crypto_ram : 6.13 41.73 2.61 0.00 0.00 2987211.02 217009.64 2553054.61
00:36:06.789 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x100 length 0x100
00:36:06.789 crypto_ram : 5.95 31.60 1.98 0.00 0.00 3728409.12 39663.53 3165787.71
00:36:06.789 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x0 length 0x100
00:36:06.789 crypto_ram1 : 6.14 41.72 2.61 0.00 0.00 2879639.37 216097.84 2348810.24
00:36:06.789 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x100 length 0x100
00:36:06.789 crypto_ram1 : 5.97 34.16 2.14 0.00 0.00 3348787.36 53112.65 2888598.93
00:36:06.789 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x0 length 0x100
00:36:06.789 crypto_ram2 : 5.67 249.55 15.60 0.00 0.00 453210.86 6981.01 616380.33
00:36:06.789 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x100 length 0x100
00:36:06.789 crypto_ram2 : 5.74 207.82 12.99 0.00 0.00 536620.18 28835.84 692971.97
00:36:06.789 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x0 length 0x100
00:36:06.789 crypto_ram3 : 5.80 262.55 16.41 0.00 0.00 419254.20 53112.65 474138.71
00:36:06.789 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:06.789 Verification LBA range: start 0x100 length 0x100
00:36:06.789 crypto_ram3 : 5.85 223.89 13.99 0.00 0.00 482594.68 13050.21 634616.43
00:36:06.789 ===================================================================================================================
00:36:06.789 Total : 1093.02 68.31 0.00 0.00 857063.03 6981.01 3165787.71
00:36:07.048
00:36:07.048 real 0m9.338s
00:36:07.048 user 0m17.673s
00:36:07.048 sys 0m0.492s
00:36:07.048 20:11:58 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable
00:36:07.048 20:11:58 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:36:07.048 ************************************
00:36:07.048 END TEST bdev_verify_big_io
00:36:07.048 ************************************
00:36:07.306 20:11:58 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:07.306 20:11:58 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:36:07.306 20:11:58 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable
00:36:07.306 20:11:58 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:36:07.306 ************************************
00:36:07.306 START TEST bdev_write_zeroes
00:36:07.306 ************************************
00:36:07.306 20:11:58 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:07.306 [2024-07-24 20:11:58.764712] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:36:07.306 [2024-07-24 20:11:58.764775] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588000 ]
00:36:07.306 [2024-07-24 20:11:58.893044] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:36:07.565 [2024-07-24 20:11:58.995946] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:36:07.565 [2024-07-24 20:11:59.017284] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:36:07.565 [2024-07-24 20:11:59.025313] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:36:07.565 [2024-07-24 20:11:59.033331] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:36:07.565 [2024-07-24 20:11:59.146574] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:36:10.098 [2024-07-24 20:12:01.364061] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:36:10.098 [2024-07-24 20:12:01.364126] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:36:10.098 [2024-07-24 20:12:01.364142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:10.098 [2024-07-24 20:12:01.372080] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:36:10.098 [2024-07-24 20:12:01.372099] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:36:10.098 [2024-07-24 20:12:01.372111] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:10.098 [2024-07-24 20:12:01.380101] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:36:10.098 [2024-07-24 20:12:01.380119] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:36:10.098 [2024-07-24 20:12:01.380131] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:10.098 [2024-07-24 20:12:01.388122] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:36:10.098 [2024-07-24 20:12:01.388139] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:36:10.098 [2024-07-24 20:12:01.388151] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
Running I/O for 1 seconds...
00:36:11.035
00:36:11.035 Latency(us)
00:36:11.035 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:36:11.035 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:11.035 crypto_ram : 1.03 1996.23 7.80 0.00 0.00 63642.98 5670.29 77047.54
00:36:11.036 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:11.036 crypto_ram1 : 1.03 2001.76 7.82 0.00 0.00 63102.00 5641.79 71576.71
00:36:11.036 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:11.036 crypto_ram2 : 1.02 15365.18 60.02 0.00 0.00 8203.80 2464.72 10827.69
00:36:11.036 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:11.036 crypto_ram3 : 1.02 15397.48 60.15 0.00 0.00 8160.30 2464.72 8548.17
00:36:11.036 ===================================================================================================================
00:36:11.036 Total : 34760.65 135.78 0.00 0.00 14557.25 2464.72 77047.54
00:36:11.605
00:36:11.605 real 0m4.230s
00:36:11.605 user 0m3.806s
00:36:11.605 sys 0m0.372s
00:36:11.605 20:12:02 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable
00:36:11.605 20:12:02 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:36:11.605 ************************************
00:36:11.605 END TEST bdev_write_zeroes
00:36:11.605 ************************************
00:36:11.605 20:12:02 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:11.605 20:12:02 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:36:11.605 20:12:02 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable
00:36:11.605 20:12:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:36:11.605 ************************************
00:36:11.605 START TEST bdev_json_nonenclosed
00:36:11.605 ************************************
00:36:11.605 20:12:03 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:11.605 [2024-07-24 20:12:03.081585] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:36:11.605 [2024-07-24 20:12:03.081636] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588667 ]
00:36:11.605 [2024-07-24 20:12:03.195253] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:36:11.865 [2024-07-24 20:12:03.306041] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:36:11.865 [2024-07-24 20:12:03.306117] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:36:11.865 [2024-07-24 20:12:03.306135] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:36:11.865 [2024-07-24 20:12:03.306148] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:36:11.865
00:36:11.865 real 0m0.388s
00:36:11.865 user 0m0.250s
00:36:11.865 sys 0m0.134s
00:36:11.865 20:12:03 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable
00:36:11.865 20:12:03 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:36:11.865 ************************************
00:36:11.865 END TEST bdev_json_nonenclosed
00:36:11.865 ************************************
00:36:12.123 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:12.123 20:12:03 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:36:12.123 20:12:03 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable
00:36:12.123 20:12:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:36:12.123 ************************************
00:36:12.123 START TEST bdev_json_nonarray
00:36:12.123 ************************************
00:36:12.123 20:12:03 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:12.123 [2024-07-24 20:12:03.563064] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization...
00:36:12.123 [2024-07-24 20:12:03.563134] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588845 ]
00:36:12.123 [2024-07-24 20:12:03.693180] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:36:12.383 [2024-07-24 20:12:03.795004] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:36:12.383 [2024-07-24 20:12:03.795083] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:36:12.383 [2024-07-24 20:12:03.795100] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:36:12.383 [2024-07-24 20:12:03.795113] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:36:12.383
00:36:12.383 real 0m0.399s
00:36:12.383 user 0m0.233s
00:36:12.383 sys 0m0.162s
00:36:12.383 20:12:03 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable
00:36:12.383 20:12:03 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:36:12.383 ************************************
00:36:12.383 END TEST bdev_json_nonarray
00:36:12.383 ************************************
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]]
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]]
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]]
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:36:12.383 20:12:03 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:36:12.383
00:36:12.383 real 1m14.007s
00:36:12.383 user 2m43.328s
00:36:12.383 sys 0m9.439s
00:36:12.383 20:12:03 blockdev_crypto_qat -- common/autotest_common.sh@1126 -- # xtrace_disable
00:36:12.383 20:12:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:36:12.383 ************************************
00:36:12.383 END TEST blockdev_crypto_qat
00:36:12.383 ************************************
00:36:12.642 20:12:03 -- spdk/autotest.sh@364 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:36:12.642 20:12:04 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:36:12.642 20:12:04 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:36:12.642 20:12:04 -- common/autotest_common.sh@10 -- # set +x
00:36:12.642 ************************************
00:36:12.642 START TEST chaining
00:36:12.642 ************************************
00:36:12.642 20:12:04 chaining -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:36:12.642 * Looking for test storage...
00:36:12.642 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:36:12.642 20:12:04 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@7 -- # uname -s
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:36:12.642 20:12:04 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:36:12.642 20:12:04 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:36:12.642 20:12:04 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:36:12.642 20:12:04 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:12.642 20:12:04 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:12.642 20:12:04 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:12.642 20:12:04 chaining -- paths/export.sh@5 -- # export PATH
00:36:12.642 20:12:04 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@47 -- # : 0
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0
00:36:12.642 20:12:04 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0
00:36:12.642 20:12:04 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500)
00:36:12.642 20:12:04 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122)
00:36:12.642 20:12:04 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock
00:36:12.642 20:12:04 chaining -- bdev/chaining.sh@20 -- # declare -A stats
00:36:12.642 20:12:04 chaining -- bdev/chaining.sh@66 -- # nvmftestinit
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']'
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@448 -- # prepare_net_devs
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns
00:36:12.642 20:12:04 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns
00:36:12.642 20:12:04 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null'
00:36:12.642 20:12:04 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns
00:36:12.643 20:12:04 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]]
00:36:12.643 20:12:04 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs
00:36:12.643 20:12:04 chaining -- nvmf/common.sh@285 -- # xtrace_disable
00:36:12.643 20:12:04 chaining -- common/autotest_common.sh@10 -- # set +x
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@291 -- # pci_devs=()
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@291 -- # local -a pci_devs
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@292 -- # pci_net_devs=()
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@293 -- # pci_drivers=()
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@295 -- # net_devs=()
00:36:20.767 20:12:11 chaining -- nvmf/common.sh@295 -- # local -ga net_devs
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@296 -- # e810=()
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@296 -- # local -ga e810
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@297 -- # x722=()
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@297 -- # local -ga x722
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@298 -- # mlx=()
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@298 -- # local -ga mlx
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]})
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}")
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]]
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]]
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]]
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]]
00:36:20.768 20:12:11 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 ))
00:36:20.768 20:12:11 chaining --
nvmf/common.sh@336 -- # return 1 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:36:20.768 WARNING: No supported devices were found, fallback requested for tcp test 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:36:20.768 20:12:11 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:36:20.768 Cannot find device "nvmf_tgt_br" 
00:36:20.768 20:12:12 chaining -- nvmf/common.sh@155 -- # true 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:36:20.768 Cannot find device "nvmf_tgt_br2" 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@156 -- # true 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:36:20.768 Cannot find device "nvmf_tgt_br" 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@158 -- # true 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:36:20.768 Cannot find device "nvmf_tgt_br2" 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@159 -- # true 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:36:20.768 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@162 -- # true 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:36:20.768 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@163 -- # true 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns 
nvmf_tgt_ns_spdk 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:36:20.768 20:12:12 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:36:21.027 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:36:21.027 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.122 ms 00:36:21.027 00:36:21.027 --- 10.0.0.2 ping statistics --- 00:36:21.027 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:21.027 rtt min/avg/max/mdev = 0.122/0.122/0.122/0.000 ms 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:36:21.027 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:36:21.027 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.074 ms 00:36:21.027 00:36:21.027 --- 10.0.0.3 ping statistics --- 00:36:21.027 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:21.027 rtt min/avg/max/mdev = 0.074/0.074/0.074/0.000 ms 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:36:21.027 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:36:21.027 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.027 ms 00:36:21.027 00:36:21.027 --- 10.0.0.1 ping statistics --- 00:36:21.027 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:21.027 rtt min/avg/max/mdev = 0.027/0.027/0.027/0.000 ms 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@433 -- # return 0 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:36:21.027 20:12:12 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:36:21.027 20:12:12 chaining -- nvmf/common.sh@479 -- # timing_enter 
start_nvmf_tgt 00:36:21.027 20:12:12 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:36:21.028 20:12:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:21.028 20:12:12 chaining -- nvmf/common.sh@481 -- # nvmfpid=1592918 00:36:21.028 20:12:12 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:36:21.028 20:12:12 chaining -- nvmf/common.sh@482 -- # waitforlisten 1592918 00:36:21.028 20:12:12 chaining -- common/autotest_common.sh@831 -- # '[' -z 1592918 ']' 00:36:21.028 20:12:12 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:21.028 20:12:12 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:36:21.028 20:12:12 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:21.028 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:21.028 20:12:12 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:21.028 20:12:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:21.286 [2024-07-24 20:12:12.706615] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:36:21.286 [2024-07-24 20:12:12.706753] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:21.545 [2024-07-24 20:12:12.925707] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:21.545 [2024-07-24 20:12:13.064109] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:36:21.545 [2024-07-24 20:12:13.064167] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:36:21.545 [2024-07-24 20:12:13.064186] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:36:21.545 [2024-07-24 20:12:13.064203] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:36:21.545 [2024-07-24 20:12:13.064217] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:36:21.545 [2024-07-24 20:12:13.064251] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:22.112 20:12:13 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:22.112 20:12:13 chaining -- common/autotest_common.sh@864 -- # return 0 00:36:22.112 20:12:13 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:36:22.112 20:12:13 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:36:22.112 20:12:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:22.112 20:12:13 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:36:22.112 20:12:13 chaining -- bdev/chaining.sh@69 -- # mktemp 00:36:22.112 20:12:13 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.XoTNgWVhoJ 00:36:22.112 20:12:13 chaining -- bdev/chaining.sh@69 -- # mktemp 00:36:22.112 20:12:13 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.5Nd1fOscQg 00:36:22.112 20:12:13 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:36:22.112 20:12:13 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:36:22.112 20:12:13 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:22.112 20:12:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:22.112 malloc0 00:36:22.112 true 00:36:22.112 true 00:36:22.112 [2024-07-24 20:12:13.684733] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:22.112 crypto0 00:36:22.112 [2024-07-24 20:12:13.692761] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 
00:36:22.112 crypto1 00:36:22.112 [2024-07-24 20:12:13.700913] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:36:22.371 [2024-07-24 20:12:13.717203] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@85 -- # update_stats 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@561 -- # 
xtrace_disable 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "copy").executed' 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:22.371 20:12:13 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.XoTNgWVhoJ bs=1K count=64 00:36:22.371 64+0 records in 00:36:22.371 64+0 records out 00:36:22.371 65536 bytes (66 kB, 64 KiB) copied, 0.00107201 s, 61.1 MB/s 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.XoTNgWVhoJ --ob Nvme0n1 --bs 65536 --count 1 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@25 -- # local config 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:22.371 20:12:13 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:22.371 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:22.630 20:12:13 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:22.630 "subsystems": [ 00:36:22.630 { 00:36:22.630 "subsystem": "bdev", 00:36:22.630 "config": [ 00:36:22.630 { 00:36:22.630 "method": "bdev_nvme_attach_controller", 00:36:22.630 "params": { 00:36:22.630 "trtype": "tcp", 00:36:22.630 "adrfam": "IPv4", 00:36:22.630 "name": "Nvme0", 00:36:22.630 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:22.630 "traddr": "10.0.0.2", 00:36:22.630 "trsvcid": "4420" 00:36:22.630 } 00:36:22.630 }, 00:36:22.630 { 00:36:22.630 "method": "bdev_set_options", 00:36:22.630 "params": { 00:36:22.630 "bdev_auto_examine": false 00:36:22.630 } 00:36:22.630 } 00:36:22.630 ] 00:36:22.630 } 00:36:22.630 ] 00:36:22.630 }' 00:36:22.630 20:12:13 chaining -- 
bdev/chaining.sh@33 -- # echo '{ 00:36:22.630 "subsystems": [ 00:36:22.630 { 00:36:22.630 "subsystem": "bdev", 00:36:22.630 "config": [ 00:36:22.630 { 00:36:22.630 "method": "bdev_nvme_attach_controller", 00:36:22.630 "params": { 00:36:22.630 "trtype": "tcp", 00:36:22.630 "adrfam": "IPv4", 00:36:22.630 "name": "Nvme0", 00:36:22.630 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:22.630 "traddr": "10.0.0.2", 00:36:22.630 "trsvcid": "4420" 00:36:22.630 } 00:36:22.630 }, 00:36:22.630 { 00:36:22.630 "method": "bdev_set_options", 00:36:22.630 "params": { 00:36:22.630 "bdev_auto_examine": false 00:36:22.630 } 00:36:22.630 } 00:36:22.630 ] 00:36:22.630 } 00:36:22.630 ] 00:36:22.630 }' 00:36:22.630 20:12:13 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.XoTNgWVhoJ --ob Nvme0n1 --bs 65536 --count 1 00:36:22.630 [2024-07-24 20:12:14.044842] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:36:22.630 [2024-07-24 20:12:14.044923] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593140 ] 00:36:22.630 [2024-07-24 20:12:14.173866] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:22.889 [2024-07-24 20:12:14.272421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:23.148  Copying: 64/64 [kB] (average 20 MBps) 00:36:23.148 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:23.148 20:12:14 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.148 20:12:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.148 20:12:14 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:23.148 20:12:14 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt 
]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@40 -- # [[ -z copy 
]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@96 -- # update_stats 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.408 20:12:14 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:23.408 20:12:14 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.408 20:12:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.667 20:12:15 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:23.667 20:12:15 chaining -- 
bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:23.667 20:12:15 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:23.667 20:12:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:23.667 20:12:15 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.5Nd1fOscQg --ib Nvme0n1 --bs 65536 --count 1 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@25 -- # local config 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:23.667 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:23.667 "subsystems": [ 00:36:23.667 { 00:36:23.667 "subsystem": "bdev", 00:36:23.667 "config": [ 00:36:23.667 { 00:36:23.667 "method": "bdev_nvme_attach_controller", 00:36:23.667 "params": { 00:36:23.667 "trtype": "tcp", 00:36:23.667 "adrfam": "IPv4", 00:36:23.667 "name": "Nvme0", 00:36:23.667 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:23.667 "traddr": "10.0.0.2", 00:36:23.667 "trsvcid": "4420" 00:36:23.667 } 00:36:23.667 }, 00:36:23.667 { 00:36:23.667 "method": "bdev_set_options", 00:36:23.667 "params": { 00:36:23.667 "bdev_auto_examine": false 00:36:23.667 } 00:36:23.667 } 00:36:23.667 ] 00:36:23.667 } 00:36:23.667 ] 00:36:23.667 }' 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.5Nd1fOscQg --ib Nvme0n1 --bs 65536 --count 1 00:36:23.667 20:12:15 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:23.667 "subsystems": [ 00:36:23.667 { 00:36:23.667 "subsystem": "bdev", 00:36:23.667 "config": [ 00:36:23.667 { 00:36:23.667 "method": "bdev_nvme_attach_controller", 00:36:23.667 "params": { 00:36:23.668 "trtype": "tcp", 00:36:23.668 "adrfam": "IPv4", 00:36:23.668 "name": "Nvme0", 00:36:23.668 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:23.668 "traddr": "10.0.0.2", 00:36:23.668 "trsvcid": "4420" 00:36:23.668 } 00:36:23.668 }, 00:36:23.668 { 00:36:23.668 "method": "bdev_set_options", 00:36:23.668 "params": { 00:36:23.668 "bdev_auto_examine": false 00:36:23.668 } 00:36:23.668 } 00:36:23.668 ] 00:36:23.668 } 00:36:23.668 ] 00:36:23.668 }' 00:36:23.668 [2024-07-24 20:12:15.189484] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:36:23.668 [2024-07-24 20:12:15.189556] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593351 ] 00:36:23.926 [2024-07-24 20:12:15.320832] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:23.926 [2024-07-24 20:12:15.424328] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:24.443  Copying: 64/64 [kB] (average 31 MBps) 00:36:24.443 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:24.443 
20:12:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:24.443 20:12:15 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:24.443 20:12:15 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:24.444 20:12:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:24.444 20:12:16 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:24.444 20:12:16 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:36:24.444 20:12:16 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.XoTNgWVhoJ /tmp/tmp.5Nd1fOscQg 00:36:24.444 20:12:16 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:24.444 20:12:16 chaining -- bdev/chaining.sh@25 -- # local config 00:36:24.703 20:12:16 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:24.703 20:12:16 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:24.703 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:24.703 20:12:16 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:24.703 "subsystems": [ 00:36:24.703 { 00:36:24.703 "subsystem": "bdev", 00:36:24.703 "config": [ 00:36:24.703 { 00:36:24.703 "method": "bdev_nvme_attach_controller", 00:36:24.703 "params": { 00:36:24.703 "trtype": "tcp", 00:36:24.703 "adrfam": "IPv4", 00:36:24.703 "name": "Nvme0", 00:36:24.703 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:24.703 "traddr": "10.0.0.2", 00:36:24.703 "trsvcid": "4420" 00:36:24.703 } 00:36:24.703 }, 00:36:24.703 { 00:36:24.703 "method": "bdev_set_options", 00:36:24.703 "params": { 00:36:24.703 "bdev_auto_examine": false 00:36:24.703 } 00:36:24.703 } 00:36:24.703 ] 00:36:24.703 } 00:36:24.703 ] 00:36:24.703 }' 00:36:24.703 20:12:16 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:24.703 20:12:16 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:24.703 "subsystems": [ 00:36:24.703 { 00:36:24.703 "subsystem": "bdev", 00:36:24.703 "config": [ 00:36:24.703 { 00:36:24.703 "method": "bdev_nvme_attach_controller", 00:36:24.703 "params": { 00:36:24.703 "trtype": "tcp", 00:36:24.703 "adrfam": "IPv4", 00:36:24.703 "name": "Nvme0", 00:36:24.703 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:24.703 "traddr": "10.0.0.2", 00:36:24.703 "trsvcid": "4420" 00:36:24.703 } 00:36:24.703 }, 00:36:24.703 { 00:36:24.703 "method": "bdev_set_options", 00:36:24.703 "params": { 00:36:24.703 "bdev_auto_examine": false 00:36:24.703 } 00:36:24.703 } 00:36:24.703 ] 00:36:24.703 } 00:36:24.703 ] 00:36:24.703 }' 00:36:24.703 [2024-07-24 20:12:16.140604] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 
initialization... 00:36:24.703 [2024-07-24 20:12:16.140675] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593492 ] 00:36:24.703 [2024-07-24 20:12:16.259214] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:24.962 [2024-07-24 20:12:16.356069] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:25.221  Copying: 64/64 [kB] (average 20 MBps) 00:36:25.221 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@106 -- # update_stats 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:25.221 20:12:16 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:25.221 20:12:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:25.221 20:12:16 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:36:25.221 20:12:16 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # 
rpc=rpc_cmd 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:25.480 20:12:16 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.XoTNgWVhoJ --ob Nvme0n1 --bs 4096 --count 16 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@25 -- # local config 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:25.480 20:12:16 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:25.481 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:25.481 20:12:17 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:25.481 "subsystems": [ 00:36:25.481 { 00:36:25.481 "subsystem": "bdev", 00:36:25.481 "config": [ 00:36:25.481 { 00:36:25.481 "method": "bdev_nvme_attach_controller", 00:36:25.481 "params": { 00:36:25.481 "trtype": "tcp", 00:36:25.481 "adrfam": "IPv4", 00:36:25.481 "name": "Nvme0", 00:36:25.481 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:25.481 "traddr": "10.0.0.2", 00:36:25.481 "trsvcid": "4420" 00:36:25.481 } 00:36:25.481 }, 00:36:25.481 { 00:36:25.481 "method": "bdev_set_options", 00:36:25.481 "params": { 00:36:25.481 "bdev_auto_examine": false 00:36:25.481 } 00:36:25.481 } 00:36:25.481 ] 00:36:25.481 } 00:36:25.481 ] 00:36:25.481 }' 00:36:25.481 20:12:17 chaining -- bdev/chaining.sh@33 -- # 
echo '{ 00:36:25.481 "subsystems": [ 00:36:25.481 { 00:36:25.481 "subsystem": "bdev", 00:36:25.481 "config": [ 00:36:25.481 { 00:36:25.481 "method": "bdev_nvme_attach_controller", 00:36:25.481 "params": { 00:36:25.481 "trtype": "tcp", 00:36:25.481 "adrfam": "IPv4", 00:36:25.481 "name": "Nvme0", 00:36:25.481 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:25.481 "traddr": "10.0.0.2", 00:36:25.481 "trsvcid": "4420" 00:36:25.481 } 00:36:25.481 }, 00:36:25.481 { 00:36:25.481 "method": "bdev_set_options", 00:36:25.481 "params": { 00:36:25.481 "bdev_auto_examine": false 00:36:25.481 } 00:36:25.481 } 00:36:25.481 ] 00:36:25.481 } 00:36:25.481 ] 00:36:25.481 }' 00:36:25.481 20:12:17 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.XoTNgWVhoJ --ob Nvme0n1 --bs 4096 --count 16 00:36:25.481 [2024-07-24 20:12:17.067483] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:36:25.481 [2024-07-24 20:12:17.067551] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593580 ] 00:36:25.740 [2024-07-24 20:12:17.201186] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:25.740 [2024-07-24 20:12:17.305146] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:26.258  Copying: 64/64 [kB] (average 12 MBps) 00:36:26.258 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:26.258 20:12:17 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.258 20:12:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.258 20:12:17 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:26.258 20:12:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:26.258 20:12:17 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.258 20:12:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.258 20:12:17 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@114 -- # update_stats 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.518 20:12:17 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:26.518 20:12:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:26.518 20:12:18 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.518 20:12:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.518 20:12:18 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:26.518 20:12:18 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:26.518 20:12:18 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.518 20:12:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:26.518 20:12:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.518 20:12:18 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:26.777 20:12:18 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:26.777 20:12:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.777 20:12:18 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@117 -- # : 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.5Nd1fOscQg --ib Nvme0n1 --bs 4096 --count 16 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@25 -- # local config 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:26.777 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:26.777 "subsystems": [ 00:36:26.777 { 00:36:26.777 "subsystem": "bdev", 00:36:26.777 "config": [ 00:36:26.777 { 00:36:26.777 "method": "bdev_nvme_attach_controller", 00:36:26.777 "params": { 00:36:26.777 "trtype": "tcp", 00:36:26.777 "adrfam": "IPv4", 00:36:26.777 "name": "Nvme0", 00:36:26.777 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:26.777 "traddr": "10.0.0.2", 00:36:26.777 "trsvcid": "4420" 00:36:26.777 } 00:36:26.777 }, 00:36:26.777 { 00:36:26.777 "method": "bdev_set_options", 00:36:26.777 "params": { 00:36:26.777 "bdev_auto_examine": false 00:36:26.777 } 00:36:26.777 } 00:36:26.777 ] 00:36:26.777 } 00:36:26.777 ] 00:36:26.777 }' 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.5Nd1fOscQg --ib Nvme0n1 --bs 4096 --count 16 00:36:26.777 20:12:18 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:26.777 "subsystems": [ 00:36:26.777 { 00:36:26.777 "subsystem": "bdev", 00:36:26.777 "config": [ 00:36:26.777 { 00:36:26.777 "method": "bdev_nvme_attach_controller", 00:36:26.777 "params": { 00:36:26.777 "trtype": "tcp", 00:36:26.777 "adrfam": "IPv4", 00:36:26.777 "name": "Nvme0", 00:36:26.777 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:26.777 "traddr": "10.0.0.2", 00:36:26.777 "trsvcid": "4420" 00:36:26.777 } 00:36:26.777 }, 00:36:26.777 { 00:36:26.777 "method": "bdev_set_options", 00:36:26.777 "params": { 00:36:26.777 "bdev_auto_examine": false 00:36:26.777 } 00:36:26.777 } 00:36:26.777 ] 00:36:26.777 } 00:36:26.777 ] 00:36:26.777 }' 00:36:26.777 [2024-07-24 20:12:18.272870] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 
initialization... 00:36:26.777 [2024-07-24 20:12:18.272941] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593794 ] 00:36:27.036 [2024-07-24 20:12:18.403245] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:27.036 [2024-07-24 20:12:18.503894] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:27.553  Copying: 64/64 [kB] (average 1333 kBps) 00:36:27.553 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:27.553 20:12:18 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:27.553 20:12:18 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:27.553 20:12:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.553 20:12:18 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:27.553 20:12:19 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:27.553 20:12:19 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:27.553 20:12:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:27.553 20:12:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:27.813 20:12:19 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:36:27.813 20:12:19 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.XoTNgWVhoJ /tmp/tmp.5Nd1fOscQg 00:36:27.813 20:12:19 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:36:27.813 20:12:19 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:36:27.813 20:12:19 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.XoTNgWVhoJ /tmp/tmp.5Nd1fOscQg 00:36:27.813 20:12:19 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@117 -- # sync 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@120 -- # set +e 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:27.813 rmmod nvme_tcp 00:36:27.813 rmmod nvme_fabrics 00:36:27.813 rmmod nvme_keyring 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@124 -- # set -e 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@125 -- # return 0 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@489 -- # '[' -n 1592918 ']' 00:36:27.813 20:12:19 chaining -- nvmf/common.sh@490 -- # killprocess 1592918 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@950 -- # '[' -z 
1592918 ']' 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@954 -- # kill -0 1592918 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@955 -- # uname 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1592918 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1592918' 00:36:27.813 killing process with pid 1592918 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@969 -- # kill 1592918 00:36:27.813 20:12:19 chaining -- common/autotest_common.sh@974 -- # wait 1592918 00:36:28.072 20:12:19 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:28.072 20:12:19 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:28.072 20:12:19 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:28.072 20:12:19 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:28.072 20:12:19 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:28.072 20:12:19 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:28.072 20:12:19 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:28.072 20:12:19 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:28.331 20:12:19 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:28.331 20:12:19 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:36:28.331 20:12:19 chaining -- bdev/chaining.sh@132 -- # bperfpid=1594004 00:36:28.331 20:12:19 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1594004 00:36:28.331 20:12:19 chaining -- bdev/chaining.sh@131 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:28.331 20:12:19 chaining -- common/autotest_common.sh@831 -- # '[' -z 1594004 ']' 00:36:28.331 20:12:19 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:28.331 20:12:19 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:36:28.331 20:12:19 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:28.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:28.331 20:12:19 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:28.331 20:12:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:28.331 [2024-07-24 20:12:19.755624] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:36:28.331 [2024-07-24 20:12:19.755702] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1594004 ] 00:36:28.331 [2024-07-24 20:12:19.876292] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:28.590 [2024-07-24 20:12:19.973457] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:29.158 20:12:20 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:29.158 20:12:20 chaining -- common/autotest_common.sh@864 -- # return 0 00:36:29.158 20:12:20 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:36:29.158 20:12:20 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:29.158 20:12:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:29.158 malloc0 00:36:29.417 true 00:36:29.417 true 00:36:29.417 [2024-07-24 20:12:20.764428] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "key0" 00:36:29.417 crypto0 00:36:29.417 [2024-07-24 20:12:20.772454] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:29.417 crypto1 00:36:29.417 20:12:20 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:29.417 20:12:20 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:29.417 Running I/O for 5 seconds... 00:36:34.703 00:36:34.703 Latency(us) 00:36:34.703 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:34.703 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:34.703 Verification LBA range: start 0x0 length 0x2000 00:36:34.703 crypto1 : 5.01 11447.15 44.72 0.00 0.00 22302.38 6439.62 14246.96 00:36:34.703 =================================================================================================================== 00:36:34.703 Total : 11447.15 44.72 0.00 0.00 22302.38 6439.62 14246.96 00:36:34.703 0 00:36:34.703 20:12:26 chaining -- bdev/chaining.sh@146 -- # killprocess 1594004 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@950 -- # '[' -z 1594004 ']' 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@954 -- # kill -0 1594004 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@955 -- # uname 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1594004 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1594004' 00:36:34.703 killing process with pid 1594004 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@969 -- # kill 1594004 00:36:34.703 Received shutdown 
signal, test time was about 5.000000 seconds 00:36:34.703 00:36:34.703 Latency(us) 00:36:34.703 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:34.703 =================================================================================================================== 00:36:34.703 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:34.703 20:12:26 chaining -- common/autotest_common.sh@974 -- # wait 1594004 00:36:34.962 20:12:26 chaining -- bdev/chaining.sh@152 -- # bperfpid=1594880 00:36:34.962 20:12:26 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1594880 00:36:34.962 20:12:26 chaining -- common/autotest_common.sh@831 -- # '[' -z 1594880 ']' 00:36:34.962 20:12:26 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:34.962 20:12:26 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:36:34.962 20:12:26 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:34.962 20:12:26 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:34.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:34.962 20:12:26 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:34.962 20:12:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:34.962 [2024-07-24 20:12:26.403707] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:36:34.962 [2024-07-24 20:12:26.403847] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1594880 ] 00:36:35.259 [2024-07-24 20:12:26.602320] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:35.259 [2024-07-24 20:12:26.712532] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:35.259 20:12:26 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:35.259 20:12:26 chaining -- common/autotest_common.sh@864 -- # return 0 00:36:35.259 20:12:26 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:36:35.259 20:12:26 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:35.259 20:12:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:35.549 malloc0 00:36:35.549 true 00:36:35.549 true 00:36:35.549 [2024-07-24 20:12:26.970018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:36:35.549 [2024-07-24 20:12:26.970067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:35.549 [2024-07-24 20:12:26.970087] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15936e0 00:36:35.549 [2024-07-24 20:12:26.970099] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:35.549 [2024-07-24 20:12:26.971186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:35.549 [2024-07-24 20:12:26.971211] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:36:35.549 pt0 00:36:35.549 [2024-07-24 20:12:26.978048] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:35.549 crypto0 00:36:35.549 [2024-07-24 20:12:26.986069] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:35.549 crypto1 00:36:35.549 20:12:26 chaining -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:35.549 20:12:26 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:35.808 Running I/O for 5 seconds... 00:36:41.082 00:36:41.082 Latency(us) 00:36:41.082 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:41.082 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:41.082 Verification LBA range: start 0x0 length 0x2000 00:36:41.082 crypto1 : 5.02 8922.54 34.85 0.00 0.00 28613.35 6496.61 17780.20 00:36:41.083 =================================================================================================================== 00:36:41.083 Total : 8922.54 34.85 0.00 0.00 28613.35 6496.61 17780.20 00:36:41.083 0 00:36:41.083 20:12:32 chaining -- bdev/chaining.sh@167 -- # killprocess 1594880 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@950 -- # '[' -z 1594880 ']' 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@954 -- # kill -0 1594880 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@955 -- # uname 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1594880 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1594880' 00:36:41.083 killing process with pid 1594880 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@969 -- # kill 1594880 00:36:41.083 Received shutdown signal, test time was about 5.000000 seconds 00:36:41.083 00:36:41.083 Latency(us) 00:36:41.083 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:41.083 
=================================================================================================================== 00:36:41.083 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@974 -- # wait 1594880 00:36:41.083 20:12:32 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:36:41.083 20:12:32 chaining -- bdev/chaining.sh@170 -- # killprocess 1594880 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@950 -- # '[' -z 1594880 ']' 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@954 -- # kill -0 1594880 00:36:41.083 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1594880) - No such process 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@977 -- # echo 'Process with pid 1594880 is not found' 00:36:41.083 Process with pid 1594880 is not found 00:36:41.083 20:12:32 chaining -- bdev/chaining.sh@171 -- # wait 1594880 00:36:41.083 20:12:32 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:41.083 20:12:32 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:36:41.083 20:12:32 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@296 -- # e810=() 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@297 -- # x722=() 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@298 -- # mlx=() 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:36:41.083 20:12:32 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@336 -- # return 1 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:36:41.083 WARNING: No supported devices were found, fallback requested for tcp test 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:36:41.083 20:12:32 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:36:41.083 Cannot find device "nvmf_tgt_br" 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@155 -- # true 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:36:41.083 Cannot find device "nvmf_tgt_br2" 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@156 -- # true 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:36:41.083 Cannot find device "nvmf_tgt_br" 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@158 -- # true 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:36:41.083 Cannot find device "nvmf_tgt_br2" 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@159 -- # true 00:36:41.083 20:12:32 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:36:41.343 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@162 -- # true 00:36:41.343 20:12:32 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:36:41.343 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@163 -- # true 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:36:41.343 20:12:32 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:36:41.344 20:12:32 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:36:41.601 20:12:32 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:36:41.601 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:36:41.601 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.111 ms 00:36:41.601 00:36:41.601 --- 10.0.0.2 ping statistics --- 00:36:41.601 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:41.601 rtt min/avg/max/mdev = 0.111/0.111/0.111/0.000 ms 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:36:41.601 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:36:41.601 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.071 ms 00:36:41.601 00:36:41.601 --- 10.0.0.3 ping statistics --- 00:36:41.601 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:41.601 rtt min/avg/max/mdev = 0.071/0.071/0.071/0.000 ms 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:36:41.601 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:36:41.601 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.106 ms 00:36:41.601 00:36:41.601 --- 10.0.0.1 ping statistics --- 00:36:41.601 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:41.601 rtt min/avg/max/mdev = 0.106/0.106/0.106/0.000 ms 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@433 -- # return 0 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:36:41.601 20:12:33 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:36:41.601 20:12:33 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:36:41.601 20:12:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@481 -- # nvmfpid=1596013 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:36:41.601 20:12:33 chaining -- nvmf/common.sh@482 -- # waitforlisten 1596013 00:36:41.601 20:12:33 chaining -- common/autotest_common.sh@831 -- # '[' -z 1596013 ']' 00:36:41.601 20:12:33 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:41.601 20:12:33 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:36:41.602 20:12:33 
chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:41.602 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:41.602 20:12:33 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:41.602 20:12:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:41.871 [2024-07-24 20:12:33.211046] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:36:41.871 [2024-07-24 20:12:33.211112] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:41.871 [2024-07-24 20:12:33.352741] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:42.133 [2024-07-24 20:12:33.467626] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:36:42.133 [2024-07-24 20:12:33.467680] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:36:42.133 [2024-07-24 20:12:33.467698] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:36:42.133 [2024-07-24 20:12:33.467715] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:36:42.133 [2024-07-24 20:12:33.467728] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:36:42.133 [2024-07-24 20:12:33.467762] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@864 -- # return 0 00:36:42.701 20:12:34 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:42.701 20:12:34 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:36:42.701 20:12:34 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:42.701 malloc0 00:36:42.701 [2024-07-24 20:12:34.206835] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:36:42.701 [2024-07-24 20:12:34.223091] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:36:42.701 20:12:34 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:36:42.701 20:12:34 chaining -- bdev/chaining.sh@189 -- # bperfpid=1596125 00:36:42.701 20:12:34 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:42.701 20:12:34 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1596125 /var/tmp/bperf.sock 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@831 -- # '[' -z 1596125 ']' 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@836 -- # 
local max_retries=100 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:36:42.701 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:42.701 20:12:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:42.961 [2024-07-24 20:12:34.343927] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 00:36:42.961 [2024-07-24 20:12:34.344062] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1596125 ] 00:36:42.961 [2024-07-24 20:12:34.540919] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:43.220 [2024-07-24 20:12:34.643106] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:44.157 20:12:35 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:44.157 20:12:35 chaining -- common/autotest_common.sh@864 -- # return 0 00:36:44.157 20:12:35 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:36:44.157 20:12:35 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:36:44.416 [2024-07-24 20:12:35.907074] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:44.416 nvme0n1 00:36:44.416 true 00:36:44.416 crypto0 00:36:44.416 20:12:35 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:36:44.674 Running I/O for 5 seconds... 
00:36:49.944 00:36:49.944 Latency(us) 00:36:49.944 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:49.944 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:49.944 Verification LBA range: start 0x0 length 0x2000 00:36:49.944 crypto0 : 5.03 6909.76 26.99 0.00 0.00 36920.72 5527.82 27582.11 00:36:49.944 =================================================================================================================== 00:36:49.944 Total : 6909.76 26.99 0.00 0.00 36920.72 5527.82 27582.11 00:36:49.944 0 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@205 -- # sequence=69484 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:49.944 20:12:41 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:49.944 20:12:41 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@206 -- # encrypt=34742 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:50.203 20:12:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:50.204 20:12:41 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:50.462 20:12:42 chaining -- bdev/chaining.sh@207 -- # decrypt=34742 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:50.463 20:12:42 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:50.463 20:12:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:50.721 20:12:42 chaining -- bdev/chaining.sh@208 -- # crc32c=69484 00:36:50.721 20:12:42 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:36:50.721 20:12:42 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:36:50.721 20:12:42 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:36:50.721 20:12:42 chaining -- bdev/chaining.sh@214 -- # killprocess 1596125 00:36:50.721 20:12:42 chaining -- common/autotest_common.sh@950 -- # '[' -z 1596125 ']' 00:36:50.721 20:12:42 chaining -- common/autotest_common.sh@954 -- # kill -0 1596125 00:36:50.721 20:12:42 chaining -- common/autotest_common.sh@955 -- # uname 00:36:50.721 20:12:42 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:50.721 20:12:42 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1596125 00:36:50.980 20:12:42 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:36:50.980 20:12:42 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:36:50.980 20:12:42 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1596125' 00:36:50.980 killing process with pid 1596125 00:36:50.980 20:12:42 chaining -- common/autotest_common.sh@969 -- # kill 1596125 00:36:50.980 Received shutdown signal, test time was about 5.000000 seconds 00:36:50.980 00:36:50.980 Latency(us) 00:36:50.980 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:50.980 
=================================================================================================================== 00:36:50.980 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:50.980 20:12:42 chaining -- common/autotest_common.sh@974 -- # wait 1596125 00:36:51.239 20:12:42 chaining -- bdev/chaining.sh@219 -- # bperfpid=1597153 00:36:51.239 20:12:42 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:36:51.239 20:12:42 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1597153 /var/tmp/bperf.sock 00:36:51.239 20:12:42 chaining -- common/autotest_common.sh@831 -- # '[' -z 1597153 ']' 00:36:51.239 20:12:42 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:36:51.239 20:12:42 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:36:51.239 20:12:42 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:36:51.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:36:51.239 20:12:42 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:36:51.239 20:12:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:51.239 [2024-07-24 20:12:42.641316] Starting SPDK v24.09-pre git sha1 3bc1795d3 / DPDK 24.03.0 initialization... 
00:36:51.239 [2024-07-24 20:12:42.641398] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1597153 ] 00:36:51.239 [2024-07-24 20:12:42.771904] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:51.498 [2024-07-24 20:12:42.877055] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:52.064 20:12:43 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:36:52.064 20:12:43 chaining -- common/autotest_common.sh@864 -- # return 0 00:36:52.064 20:12:43 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:36:52.064 20:12:43 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:36:52.632 [2024-07-24 20:12:43.986883] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:52.632 nvme0n1 00:36:52.632 true 00:36:52.632 crypto0 00:36:52.632 20:12:44 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:36:52.632 Running I/O for 5 seconds... 
00:36:57.904 00:36:57.904 Latency(us) 00:36:57.904 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:57.904 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:36:57.904 Verification LBA range: start 0x0 length 0x200 00:36:57.904 crypto0 : 5.01 1641.01 102.56 0.00 0.00 19117.59 1702.51 21997.30 00:36:57.904 =================================================================================================================== 00:36:57.904 Total : 1641.01 102.56 0.00 0.00 19117.59 1702.51 21997.30 00:36:57.904 0 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@233 -- # sequence=16434 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:57.904 20:12:49 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:57.904 20:12:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:57.905 20:12:49 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:57.905 20:12:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@234 -- # encrypt=8217 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:58.163 20:12:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@235 -- # decrypt=8217 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:58.423 20:12:49 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:58.423 20:12:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:58.682 20:12:50 chaining -- bdev/chaining.sh@236 -- # crc32c=16434 00:36:58.682 20:12:50 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:36:58.682 20:12:50 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:36:58.682 20:12:50 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:36:58.682 20:12:50 chaining -- bdev/chaining.sh@242 -- # killprocess 1597153 00:36:58.682 20:12:50 chaining -- common/autotest_common.sh@950 -- # '[' -z 1597153 ']' 00:36:58.682 20:12:50 chaining -- common/autotest_common.sh@954 -- # kill -0 1597153 00:36:58.682 20:12:50 chaining -- common/autotest_common.sh@955 -- # uname 00:36:58.682 20:12:50 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:58.682 20:12:50 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1597153 00:36:58.683 20:12:50 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:36:58.683 20:12:50 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:36:58.683 20:12:50 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1597153' 00:36:58.683 killing process with pid 1597153 00:36:58.683 20:12:50 chaining -- common/autotest_common.sh@969 -- # kill 1597153 00:36:58.683 Received shutdown signal, test time was about 5.000000 seconds 00:36:58.683 00:36:58.683 Latency(us) 00:36:58.683 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:58.683 
=================================================================================================================== 00:36:58.683 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:58.683 20:12:50 chaining -- common/autotest_common.sh@974 -- # wait 1597153 00:36:58.942 20:12:50 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@117 -- # sync 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@120 -- # set +e 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:58.942 rmmod nvme_tcp 00:36:58.942 rmmod nvme_fabrics 00:36:58.942 rmmod nvme_keyring 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@124 -- # set -e 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@125 -- # return 0 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@489 -- # '[' -n 1596013 ']' 00:36:58.942 20:12:50 chaining -- nvmf/common.sh@490 -- # killprocess 1596013 00:36:58.942 20:12:50 chaining -- common/autotest_common.sh@950 -- # '[' -z 1596013 ']' 00:36:58.942 20:12:50 chaining -- common/autotest_common.sh@954 -- # kill -0 1596013 00:36:58.942 20:12:50 chaining -- common/autotest_common.sh@955 -- # uname 00:36:58.942 20:12:50 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:36:58.942 20:12:50 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1596013 00:36:59.201 20:12:50 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:36:59.201 20:12:50 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:36:59.201 20:12:50 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1596013' 00:36:59.201 killing process with pid 
1596013 00:36:59.201 20:12:50 chaining -- common/autotest_common.sh@969 -- # kill 1596013 00:36:59.201 20:12:50 chaining -- common/autotest_common.sh@974 -- # wait 1596013 00:36:59.460 20:12:50 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:59.460 20:12:50 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:59.460 20:12:50 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:59.460 20:12:50 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:59.460 20:12:50 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:59.460 20:12:50 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:59.460 20:12:50 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:59.460 20:12:50 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:59.460 20:12:50 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:59.460 20:12:50 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:36:59.460 00:36:59.460 real 0m46.885s 00:36:59.460 user 1m1.115s 00:36:59.460 sys 0m13.964s 00:36:59.460 20:12:50 chaining -- common/autotest_common.sh@1126 -- # xtrace_disable 00:36:59.460 20:12:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:59.460 ************************************ 00:36:59.460 END TEST chaining 00:36:59.460 ************************************ 00:36:59.460 20:12:50 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:36:59.460 20:12:50 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:36:59.460 20:12:50 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:36:59.460 20:12:50 -- spdk/autotest.sh@379 -- # [[ 0 -eq 1 ]] 00:36:59.460 20:12:50 -- spdk/autotest.sh@384 -- # trap - SIGINT SIGTERM EXIT 00:36:59.460 20:12:50 -- spdk/autotest.sh@386 -- # timing_enter post_cleanup 00:36:59.460 20:12:50 -- common/autotest_common.sh@724 -- # xtrace_disable 00:36:59.460 20:12:50 -- common/autotest_common.sh@10 -- # set +x 00:36:59.460 
20:12:50 -- spdk/autotest.sh@387 -- # autotest_cleanup 00:36:59.460 20:12:50 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:36:59.460 20:12:50 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:36:59.460 20:12:50 -- common/autotest_common.sh@10 -- # set +x 00:37:04.734 INFO: APP EXITING 00:37:04.734 INFO: killing all VMs 00:37:04.734 INFO: killing vhost app 00:37:04.734 INFO: EXIT DONE 00:37:08.084 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:37:08.084 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:37:08.084 Waiting for block devices as requested 00:37:08.084 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:37:08.084 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:37:08.084 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:37:08.084 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:37:08.344 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:37:08.344 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:37:08.344 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:37:08.603 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:37:08.603 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:37:08.603 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:37:08.863 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:37:08.863 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:37:08.863 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:37:09.122 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:37:09.122 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:37:09.122 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:37:09.380 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:37:13.574 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:37:13.574 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:37:13.574 Cleaning 00:37:13.574 Removing: /var/run/dpdk/spdk0/config 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:37:13.574 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:37:13.574 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:37:13.574 Removing: /var/run/dpdk/spdk0/hugepage_info 00:37:13.574 Removing: /dev/shm/nvmf_trace.0 00:37:13.574 Removing: /dev/shm/spdk_tgt_trace.pid1330856 00:37:13.574 Removing: /var/run/dpdk/spdk0 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1330001 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1330856 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1331385 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1332109 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1332359 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1333213 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1333232 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1333516 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1336123 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1337453 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1337702 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1338087 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1338355 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1338636 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1338875 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1339111 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1339378 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1340146 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1342851 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1343052 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1343290 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1343589 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1343696 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1343888 00:37:13.574 
Removing: /var/run/dpdk/spdk_pid1344119 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1344320 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1344516 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1344714 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1344944 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1345260 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1345467 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1345664 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1345860 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1346061 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1346332 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1346612 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1346815 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1347012 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1347213 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1347406 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1347730 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1347962 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1348162 00:37:13.574 Removing: /var/run/dpdk/spdk_pid1348366 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1348559 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1348796 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1349117 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1349450 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1349683 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1350055 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1350424 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1350633 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1350995 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1351365 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1351429 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1351846 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1352320 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1352577 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1352724 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1357306 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1358963 00:37:13.833 Removing: 
/var/run/dpdk/spdk_pid1360591 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1361484 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1362569 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1362926 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1362989 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1363135 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1366926 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1367413 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1368373 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1368584 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1376200 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1377796 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1378769 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1383521 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1385153 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1386125 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1390204 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1392637 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1393453 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1403179 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1405227 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1406211 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1416940 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1419159 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1420140 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1430095 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1433838 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1435173 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1446663 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1449306 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1450351 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1462063 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1464819 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1466229 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1478375 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1482571 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1483553 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1484691 
00:37:13.833 Removing: /var/run/dpdk/spdk_pid1487869 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1493337 00:37:13.833 Removing: /var/run/dpdk/spdk_pid1495904 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1500450 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1503676 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1509044 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1511931 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1519273 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1521860 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1528164 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1530742 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1537052 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1540322 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1544888 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1545239 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1545589 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1546018 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1546511 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1547238 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1547966 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1548405 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1550175 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1551941 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1553545 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1554893 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1556627 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1558234 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1559842 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1561063 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1561673 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1562044 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1564341 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1566579 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1568414 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1569478 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1570707 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1571251 00:37:14.092 Removing: 
/var/run/dpdk/spdk_pid1571278 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1571503 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1571707 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1571893 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1573118 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1574629 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1576129 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1576849 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1577610 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1577924 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1577948 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1578038 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1578911 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1579483 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1579993 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1582157 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1584018 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1585853 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1586866 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1588000 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1588667 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1588845 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1593140 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1593351 00:37:14.092 Removing: /var/run/dpdk/spdk_pid1593492 00:37:14.352 Removing: /var/run/dpdk/spdk_pid1593580 00:37:14.352 Removing: /var/run/dpdk/spdk_pid1593794 00:37:14.352 Removing: /var/run/dpdk/spdk_pid1594004 00:37:14.352 Removing: /var/run/dpdk/spdk_pid1594880 00:37:14.352 Removing: /var/run/dpdk/spdk_pid1596125 00:37:14.352 Removing: /var/run/dpdk/spdk_pid1597153 00:37:14.352 Clean 00:37:14.352 20:13:05 -- common/autotest_common.sh@1451 -- # return 0 00:37:14.352 20:13:05 -- spdk/autotest.sh@388 -- # timing_exit post_cleanup 00:37:14.352 20:13:05 -- common/autotest_common.sh@730 -- # xtrace_disable 00:37:14.352 20:13:05 -- common/autotest_common.sh@10 -- # set +x 00:37:14.352 20:13:05 -- spdk/autotest.sh@390 -- # timing_exit autotest 
00:37:14.352 20:13:05 -- common/autotest_common.sh@730 -- # xtrace_disable 00:37:14.352 20:13:05 -- common/autotest_common.sh@10 -- # set +x 00:37:14.352 20:13:05 -- spdk/autotest.sh@391 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:37:14.352 20:13:05 -- spdk/autotest.sh@393 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:37:14.352 20:13:05 -- spdk/autotest.sh@393 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:37:14.352 20:13:05 -- spdk/autotest.sh@395 -- # hash lcov 00:37:14.352 20:13:05 -- spdk/autotest.sh@395 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:37:14.352 20:13:05 -- spdk/autotest.sh@397 -- # hostname 00:37:14.352 20:13:05 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:37:14.611 geninfo: WARNING: invalid characters removed from testname! 
00:37:46.686 20:13:33 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:46.686 20:13:36 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:48.065 20:13:39 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:51.350 20:13:42 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:53.881 20:13:44 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:37:56.414 20:13:47 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:37:58.946 20:13:50 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:37:58.946 20:13:50 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:37:58.946 20:13:50 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:37:58.946 20:13:50 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:37:58.946 20:13:50 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:37:58.946 20:13:50 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:37:58.946 20:13:50 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:37:58.946 20:13:50 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:37:58.946 20:13:50 -- paths/export.sh@5 -- $ export PATH
00:37:58.946 20:13:50 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:37:58.946 20:13:50 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:37:58.946 20:13:50 -- common/autobuild_common.sh@447 -- $ date +%s
00:37:58.946 20:13:50 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721844830.XXXXXX
00:37:58.946 20:13:50 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721844830.aujkdy
00:37:58.946 20:13:50 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:37:58.946 20:13:50 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
00:37:58.946 20:13:50 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:37:58.946 20:13:50 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:37:58.946 20:13:50 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:37:58.946 20:13:50 -- common/autobuild_common.sh@463 -- $ get_config_params
00:37:58.946 20:13:50 -- common/autotest_common.sh@398 -- $ xtrace_disable
00:37:58.946 20:13:50 -- common/autotest_common.sh@10 -- $ set +x
00:37:58.946 20:13:50 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:37:58.946 20:13:50 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:37:58.946 20:13:50 -- pm/common@17 -- $ local monitor
00:37:58.946 20:13:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:58.946 20:13:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:58.946 20:13:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:58.946 20:13:50 -- pm/common@21 -- $ date +%s
00:37:58.946 20:13:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:58.946 20:13:50 -- pm/common@21 -- $ date +%s
00:37:58.946 20:13:50 -- pm/common@25 -- $ sleep 1
00:37:58.946 20:13:50 -- pm/common@21 -- $ date +%s
00:37:58.946 20:13:50 -- pm/common@21 -- $ date +%s
00:37:58.946 20:13:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721844830
00:37:58.946 20:13:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721844830
00:37:58.946 20:13:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721844830
00:37:58.946 20:13:50 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721844830
00:37:58.946 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721844830_collect-vmstat.pm.log
00:37:58.946 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721844830_collect-cpu-load.pm.log
00:37:58.946 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721844830_collect-cpu-temp.pm.log
00:37:58.946 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721844830_collect-bmc-pm.bmc.pm.log
00:37:59.883 20:13:51 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:37:59.883 20:13:51 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:37:59.883 20:13:51 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:37:59.883 20:13:51 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:37:59.883 20:13:51 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:37:59.883 20:13:51 -- spdk/autopackage.sh@19 -- $ timing_finish
00:37:59.883 20:13:51 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:37:59.883 20:13:51 -- common/autotest_common.sh@737 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:37:59.883 20:13:51 -- common/autotest_common.sh@739 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:37:59.883 20:13:51 -- spdk/autopackage.sh@20 -- $ exit 0
00:37:59.883 20:13:51 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:37:59.883 20:13:51 -- pm/common@29 -- $ signal_monitor_resources TERM
00:37:59.883 20:13:51 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:37:59.883 20:13:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:59.883 20:13:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:37:59.883 20:13:51 -- pm/common@44 -- $ pid=1608083
00:37:59.883 20:13:51 -- pm/common@50 -- $ kill -TERM 1608083
00:37:59.883 20:13:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:59.883 20:13:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:37:59.883 20:13:51 -- pm/common@44 -- $ pid=1608085
00:37:59.883 20:13:51 -- pm/common@50 -- $ kill -TERM 1608085
00:37:59.883 20:13:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:59.883 20:13:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:37:59.883 20:13:51 -- pm/common@44 -- $ pid=1608087
00:37:59.883 20:13:51 -- pm/common@50 -- $ kill -TERM 1608087
00:37:59.883 20:13:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:37:59.883 20:13:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:37:59.883 20:13:51 -- pm/common@44 -- $ pid=1608111
00:37:59.883 20:13:51 -- pm/common@50 -- $ sudo -E kill -TERM 1608111
00:37:59.883 + [[ -n 1213812 ]]
00:37:59.883 + sudo kill 1213812
00:37:59.893 [Pipeline] }
00:37:59.913 [Pipeline] // stage
00:37:59.918 [Pipeline] }
00:37:59.937 [Pipeline] // timeout
00:37:59.942 [Pipeline] }
00:37:59.961 [Pipeline] // catchError
00:37:59.966 [Pipeline] }
00:37:59.982 [Pipeline] // wrap
00:37:59.989 [Pipeline] }
00:38:00.000 [Pipeline] // catchError
00:38:00.007 [Pipeline] stage
00:38:00.009 [Pipeline] { (Epilogue)
00:38:00.021 [Pipeline] catchError
00:38:00.024 [Pipeline] {
00:38:00.039 [Pipeline] echo
00:38:00.040 Cleanup processes
00:38:00.044 [Pipeline] sh
00:38:00.382 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:38:00.382 1608188 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:38:00.382 1608407 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:38:00.397 [Pipeline] sh
00:38:00.681 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:38:00.682 ++ grep -v 'sudo pgrep'
00:38:00.682 ++ awk '{print $1}'
00:38:00.682 + sudo kill -9 1608188
00:38:00.693 [Pipeline] sh
00:38:00.975 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:38:13.197 [Pipeline] sh
00:38:13.483 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:38:13.483 Artifacts sizes are good
00:38:13.497 [Pipeline] archiveArtifacts
00:38:13.504 Archiving artifacts
00:38:13.665 [Pipeline] sh
00:38:13.949 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:38:13.962 [Pipeline] cleanWs
00:38:13.996 [WS-CLEANUP] Deleting project workspace...
00:38:13.996 [WS-CLEANUP] Deferred wipeout is used...
00:38:14.003 [WS-CLEANUP] done
00:38:14.005 [Pipeline] }
00:38:14.024 [Pipeline] // catchError
00:38:14.036 [Pipeline] sh
00:38:14.317 + logger -p user.info -t JENKINS-CI
00:38:14.325 [Pipeline] }
00:38:14.341 [Pipeline] // stage
00:38:14.346 [Pipeline] }
00:38:14.362 [Pipeline] // node
00:38:14.368 [Pipeline] End of Pipeline
00:38:14.397 Finished: SUCCESS